[POSIX tar ("ustar") archive, owner core:core — member listing below; the gzip-compressed payload of kubelet.log.gz is binary and not recoverable as text]
var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz

D}kY& ::BL[`Sk4xElj8Y6fпvS6Ae7}~:C=u`O,+x2~:m}23{~8kejdE7 O;~|>i} ~$lb +҆+ Qs|(3\nG(HVJgFrðx8cZټ'hG{Pʺh-;[%Q)ٍ_=Mao0]Cz4^]ݚ|g+xz񟊊l'sZ9Q oiǟ;) ^V\zu;~ZXAeߛ'qk4G7zw1j ghL1m fvsޚI˻ˡq*O.^w9qaJ1Zpz&NLC0_{tAXyj%clĕp:S,Lq3#H,濼:ZVH)o^_݋]ƆG^IUyN-pFB6LNo̼O^i#DDV)%2A8"芄$^p/q5kL}=[Pʀg?,~qϲR(-cGcGcGcGVz[8Xt]eHȰ.RJH2`<\}Ҏ% T ªUc-z+PmNR!Yj>=ºQ IE"Z*t,ԊΉY:tJ#nC3 +gb)8ˠ`B 82II(p༧$AyP2P-( ;d! !v*?ȟ *!'<MpB6ԋ{%|= v, ;J)b<qFQPFy`bS6X6 & K'Θb M `]S-2T9:x,ɍ LRBMCAҫu ;2>9w A Gԁ *G [ț)DG,Y; 5DdIM,$ t\~J!+:bH<\7^1Թ zlȪ" UIJK!X` ,ȕC,X Q숀|` Y7 U)6h.,Xof!gd(ߵ[׋N:E~Ys' &gH8FQWKxa)jU~M093덆!Q?=B!X*/嫻ۥ #^G/!Ϙ?iezlio~ox}цOqVk!WoW{FK2HkpwkpONuRO F\wlL%)?+:fX`iW@4{-)a*0R*é9 b{pZ:uuEu)Ƙ~6Φ:+3r6TDuӔE=H=@h`hW}o$Vo8p>u}]gF@M` j \4_>o=c"WsǣC'5E?kѾ/]Ld_"T{Mvq]3̎[Wܡo׶&>̧[āl|>_`2f垽E__*G.w x\̲/ZBV/p&I)Lxt,*9^7>yI/~: 1D* JɂNQlY)pn`&l܏ Ĥ &soXeg,xr댁2Ң` S 8%:0T)A0!2A/2Tz jabŝN@d4=ܕm\v>P̍bhuv;tckުss޼iyՄ(Q6xO`y;Ozs̡&>l%~+P݅2.z1;@O= [7}cy"Pvz.8(l*x3X`2[2 .esmphdvA_E*P}vOѥ=%i?>g7&#` #t1CpW ZZgm.V#d FZHOtYRL~j%y/,/d-I,wtλ$JF`G'*H/;FpQ<8Sڡs)$0+XQ;$j/E;K1NsB"@ Ŗv)ˁ Ę]hDKKLd:ZڟRӽ7VQJpHe>.!4GID4ǒYmijO|m+iѡL)Nl cqgSPF:Qɣ<<`eܲc.w ʶ Pj|]dEN;?Gʂ 4ЂdΩ9.N¹saAj+ ʶ SߌVIQ",8Y} ؗQըwTv_2a9cϑ1(gT)J'Ol) ;z3y54\`ik3?Y vB$KI-yZ1/8Z:ߏiϯ{/L??o?+kIK]B$&Vm"LƗ$zϕF%S)e9@Q܍pS =ߘ ꏚa6sw8}!!KV.><-6wfoL*4bHeޝ8Kz[S$y8^tVoo1}t^՚\T\|3\m ݨ q#.oH,Z,y|Gjx~U:эat3p^,X[YZLF"~ocq6k|%N{8ҽDrVAwbJE~Su鎄?% \|.r/fzz4T5AFv^~4m{vDÐftj!.œhV\t鈉ƶE=HCFߺ:cش}ZZެL6>YU/8%56μyL&2])nJUKvxoİBiDҾRbJôk&) }yHfS~WQG^E*h@4q$1:h#^Qגcj$}7WLds,g{+^wZg|M;*6AsS&*+jݴ![SV=1$#++;丶F0,AnuZ[*y!g8 w"npJHyWun: X Gx<|*3誇rC6ӑC+];Jю\n9hGi!1k8ޑBhNʪ( Bk $Lg`GJs^+r7n5[9٤g K,("z߉^L8rvu5 qc"^G & fЁ4oF8F?QE)GVmӨ1 I/~w7Dǀ8{a#Gr"|(yȉ8EҜWߟv1=a1zB&h2E U 3%˶^f`BL*bQpOp4?gcӛ<%iᕻA] xȘpe 0,40ɳhG) DxT OgZ*LT7J2$6Xpr@X@+3,=S"RAc1;rzj:5f.I. uf#v7r @_ OT+75H&Bo9׺&QOs|M#gOtŗvm \g8c,YN)4G9FeʓםJ&ZrU\7Fq#;3wOcq &BdG^PnYx½SeJ9\TT r-#vu@13pB#L*/%INxG; l ICR*m#yTzb*o'r ?5Q`J^V wA w,О!e2}"fL jU7Jj ;tR γcq&+CO5ZnKvdMLC'O*',iK^g%?uαniQ7 EH%-4n:3ג9Qe[F<ۇp: MZCh^!R@ƤTStWS(P̕ *;7Dus蛣AiW='9*7C#Iy?'Aq8E i~K Oֹ1 d '\aґ9 |".0 o۾_.s<ʍ^4sLE)н^r)hr)h\-Z鵤s"jg# ˰uFC%R&Cth T9i6*Xm^K/뵔:]GYXqx]XZ?e"R􀹴LQcC6Nl 8qِZE.s>z%Fg! B/=ch%7̴ Q\[(K|oFfRm-iW۠qkA>91 a% 4S%P4ZAkHA<s90[&)N}ao< CHtDGÂa *-1 Z8/-"!Lȷ|u4Q%@&:!0If cqt\s1<.r "x΢$SRuGm*!T۪C`$WJ .2< ,a;;\ Jn0UNRN"`wG]Ẋ}wV̈$.5nSc3͌i暥wdgwM]SBso,`akM-5OÍdbLb:cB^}H#DUk};OIN1t~g5H|pM{~.'<=pK+׋7 )(Q$ ˳ottxq~!/PR`n׿] L`4ঋLQ@g1|C?_fQܯ_!CXD[goƹ<{!Hn!kc3%T!pOF"0EZ'gK$ShwyV9|?[@bYЬa!jb!F`]J/0B{"Eŀu,NpgDM`cHd4!xkq)= ?A;k)8CiF=GP t5CP̬ D΢$ R[*GjL5]BCƒaU[x%I 5mmoFHL7*B꛹ں<ܸU;K-Ă @2A{p"U(ĜG!&gXi9 }v_[a0 "Rrj+("ZJ=K6ZBDݞf7>TWHk#Zd׆Si&XҶjL!i+jVph.lA,Fy0 atUrbdT)BS W!*؅(0W1 bL &h Jլ)`X+@ X2n\"҄$Ua@B mƔ&ɚ'Aa+TJ{9ia[g"| D-Z{GQli8Gx,za~6nWJ K%){P=i[{`bVNR$̐Ҝ{e J(.*{Ԉ"EB# hK@P<8Dts8pFJfG-i[1&e`\=Kρe tOsњ3H">3rMy9EMKWJy+H+dmq\r{wBXmqCd1j~5 NU!$T*:K ^_o dv|NS"RN;Md/t@cA%rC\q9gv>*E0xh_!z"j!JE ֧<:F}G?wqpo5^рT.gik]*1B(;\WvSlP[XEBC0"vAX|: LbKxטMEXȯ p8<7zX_?WX9٫ yvXLұSOUN86l8TqH2*ۮqB_9JO?U}-ORK,'Ͽ+Z6K`h#UI$yܡKO8 8 8 ø, FU-JI\{j5(,тHhH5pD/*=y^rK>po\!4};`WK4ϖ*c݃(:+ZIQE7+$O s(5(A$#"ʍcWȚU[!T jWsa|4VI* 3-<abkrf`zxj0L$p7F-Nꉓj. ҝ9˘5m^blӦa\U"&H Q;ʼ5V @#܏< !`쬆@zCXz<$3H#|E7" U%MufS5)R'cTi)$a~4IASދ[*Tz"X&1"өWP03:G^M|к&iZ.|p5Kd)YNkz4˳Ofr'G|-ס *ho8YfE6gGi5`Aڱ܈ǫ>^p;^CQ7,33jt"5ڻN73O%ZdDXwF܌f hJ$N-h= 7叒ں{ݴK#R"öӲ_kYtfeguiBjvL?~_}&9< ًz nӚi\3tQߌ߯O`E-ٗ[ploo?3<dV<;s8 ؤpjnH}D[dmN(t\5QNwXZ9װѣm R]V0i*-#J (Puh<|0\EϢIxW`NAK DZ TJ2(28>Hh ҘTZ#D - uiZJM#¿lAi+_l ?' 
= a"Y$Fj:T% t6TENxo2⃐`EQoiS';LQ_q6Y.I̛LSI}>k,]7G4Uv{ZlVg92|a2<mG3J[#}iZo)uBs>/O]r8h.UqZ>?M˟/EҞI|H ^yo> J x|vg[%Ì&UIbGB٫8 N*ddKi)Cżt-+Of2&@M~NmVPin6oYDE{.\wE"㎵,efeY "us.v&*& ɭs*por`H/zȕC4BCE*0w7CS)j~B} .BBtuܯHI{lrcN&.Wh[ Sឺ۝cPWd BM1UmNJcxA;$#8fL[ "bm*apS񡸺c)|y۹H\:s]XFRt[ZyU`l"(֝-$-GEUƔ!dY]O=]ɳ a C*ej[m ,ٙ:o`g}v>b-$k{_&Rb- Y_"ZϡstgaNJcZOfZg}rP! ;]ϓ rv ٫(Qx N+9y i5C31HmM~Y[TF mQ X>w3uq/]?@BR;ie[$?y۰!a?Na?N|7y2FM()#إSA|ϮO/.ԝ/ޗjzYٻ6$W|q2]K3ؙ:%ڼ,Y l\b7$$rHQeV^հ?{ƫNK)d&=JKG~d9P$/^# a,OAlE<@+gWVbZx>}u4~yTlnnfI_ jB y3!IEf\prrVB'1e12EDRNrI8dѰuNnExR\:X9WdL>=w|BF?̇)&r-^d9яU7_NɟtW/1z4WF~#r(ܷwggȂT҂?|{tF "#Ÿ>=ӌcĠᴘF3rww4Tֺ oШX4 rs44s1 :HUvtg[/LE2%-*;$ZBD%PK _6D>!u .駍 Z%Ci"FykACnL( ǬQʭ|n֚t|ժgEDL =1ĀZQf!jMY@Ibz+܌߿vvʖDZPۋHf'yE;. ~3!9U9n 2-͖4"oƳWByҾ;s?: 0<ѥˋZ뻷&Ws݃;EUE\%Gy'WPloO0/.3хX [dsHQ\sϖiG֜qiwZ LXjcT7κh@.ߞ|V]g*X٘ #ݕ9{ n1zcN,f뗪\3OOQj͆hzקym]Xce";ƚBG]R՘dǙ1"$̵ür={T7JmD~>_ev~,|{0F펃8\ݛu6,c*2>wW- |}_Jl~nqa3(5v-!svL4ec+Ljc3K7M M ~OyK#xGYj Њ,ɇPk4˼M<}G$WQA)E .ui9eZז.MƣUUR?Flq?g*59dNfN Q~H!6|0w670ib7-7|]:WǾ|$a v(q({k )ң% Vonxgzb? UH+YDkn\sfk5`W/Pa,XYg@| Ѐ%$X@Mo)$>|DM_KV/:dtЖv7C&`cP0X1avrˠZ/a@?lt ,`W tH^n0c+B<.!mv?­ϩ˨z[b %% &_um}BsCiMҗDajeK>'1|$ ^d{,|`]TfK Ea- IVԐ~ւaOwsKbMc0(2^iwAڡSztdb͆CN9C)̚ن c`x&sI傰>,zs\CPg-8!% W֥_s J잯 Z O&(@! 6z޻rc[dN2>($Oznś"#(|yPZ.ܯMˆ20pLޗ "Cifo؆Y cYkwtV$[0<=Tӧez)ql)pPRњY燬}{ɯiAv]{|&Vh&ɽ!Аjr0y:,FIzEhRB$ N@G䉉EbPH^r"bзLU$w[)u.WF$hrA[R{͌.>"&'``׎t[ZāVcZ(nr8 唃)'c$Il6- 3 GBY`d,DBPCJL k]2#wa>.*].WFWY{#irvZ@Rx͵ ¸:fעP9[í6kzO-jeߋ Z"lS=޳ P+ {iDFT.sSu13.{tF_wrsvcG1:gIaIaIaIXS,)FEKDRi4T1 zJ)Ad0,g~3l|c44%M^E07:}_y6#T hAQFR3GR޾_l(8t5Z~}K-O:=`JKS>,`FP1KEg<+ WW*iYn`#-[e[LAZA etJC0(`ZugqlL/w!ObC2TTX-IJ0t c +Iڐ[+O pCI! IH1-(][6,f"Hq.}ر $;U >d:L Bp 1#8l'6վٰ-݆lҲO%3*usf(r|P<+-:eOAAd}BfAپLgeaQ#ԏu3奏c6ZnM+2]vq-2S,3 ^ș>hSB$ؖ<b&"BJG !%,{C.Db1.sG>ʙDAb1rL٣#͖{y(:.TdJ%/#f@3%[\,l|zZ:pe⤞6")Iӛ;bfbfB Iv E6D&dXȡJ7ڒL+5JC$h3p`$c1E Qnr =,mR$c0pؚf(N"u-(wx.j*K6沢[)L*fF+h`4Z -H@؅keеS0L?pl|vݘaLfrxѼ4Wz.rK8A\^߼X}m|zӇ1JQf+z<z<]-P~h#_#NU&6__^ގVZ Np|[%2OCN8)9زz+UIN.~}BKcE1 cqOcYjZ.5`~xd4yw8f+ސjjվ:(аd$ЖJyfChww.wxmF)4BLJ. uNn!i&kIjH2y1I%˙d HD\z#H! 8$Q[NC(I5/w$=w"pFY\e4[o Ƒ`咕g"pMZ̻ I! M '+c4(Jh 90xIZ3KJvBV]ĦEAb9.'Ɣˊˆ(Nh85cR?qCpCmxݯWz@ٰGiu]&K7zQȧw~%IwJuEB@ty!Q 2R8V*1(Q1Ia8g<{TXb&6(iF-.6v?h+iR xo+(E OK$+ ̳9@9ѓ"'EFOet%T\m4pΫy5-E eBI(=|I:*3V;/7\Mɇ$Qd 3.H7*$8-.QڴBR ]%.)+N@zyQkRv)C_T7J2 ,H^qC-2>dV +YҒkz˪ =A(p 5WBl),- Qᠴ>#QZ|=RVKUjIiMxTso4 "B9|{fS-RE`\@Nj=ئn~Ct{ȉX@xCrׯ.X"6]OU Vlf9g--}蛋e:61}{v3kP9t;2GRv*=84EՈW[}ȷ#v{'fېmu޸<$ht/gZ+]MgA`}^F轭X<6w v0z4XߥuT+Ӗ?{_Pmjugo+-^lQ=SrUE,y2ULϚ$Y0MRO/oV9%7~KxI`>؅;crWSZR: ] /MٓH" {NbQI(RIm^sQ%HWTꦻJO<(9M۟ʍH (usӣ%__W?;>jju~/[Q'NfWlIyˠw>S"rORN D(aV6׺ qV>^D(z6ԫپ]#Y 2TRw@6mRIK7N=w $p3ms0& ~Mbz3;S]Dʙ.u*u?Րpj9Q1FseyvP*=S$!%Tr' ^4a/5O\FfT>~K rGj"q g=Vcc]cV  7Rx{FcF lQ Z%/÷[-{wL\6G:pU9gd ;բH-nK7,(DzZ&_rə.U<ࢶ]PK'Z/~IO_Rœ>/UeC@%Ih-e??]1FOF,5]{%zuQ'Xnű1T{c88ۂ@4~'s1-Kn::M{efo`/?jkYV-86֨)3yvOvه|yI]=-t?sǶ:MUi)<(x~嬿 l9Uw*.~V/N%zYMY<On[ZmT? Lk&*Uo "AroӸC%=ņcQ>4jRKoO7kwT_xXj 8s #,y||ރTkPnfүa2q&mo9_+<4-XaeT=cX Y*2#Du^V-P@$/Z]Z.%][wryT B7 uvxwɇ$w ėNMN^^/s$lcKALh ryvyIQJNQ͠BA ,a*m,YleqΩ|{NJ WPL7)"%u:m0t:raozɥ+Z810"Jgў1^pV}ls\:^/rJq: ʹ,'Ɍ\ÑaioG@9Fگr늋NҶ2iJJY`P 8kHԌejaI(x'ȭRx2n88*FW20]|Xhw=4I:r2 +(.p mT0Bia:I'q9(җl+lLhiS]U@EM#/2QC!hKgtvoޚZ` 3\F;Y\QTDٳSMzEV2bAeVEnFF5Z64xVҡ8e G({Ӌ2,|˛Ź; thuys"`.7JhLmԷxPkK4f֔K% 9Ʌta.Dϯ]Vi[}A@=Mbx ]-N3m7%QRR$TIEL= Rɷ^<\0kTb+h9r@G-mEM`nA@(xg||x𾌱ƺބ){Q |,_ N1qMC R"! 
GCTd뿌x02# p" ϰ^_> Zk, E͠F 0a &}_v1$L4sߋbⱇ1 Fl\_ծ/((i~_x}uͮ01kCz{e\s,L\TuzJhIw0xё7c>25%Ѳ}/nR^f2YcVHhφ :x'F 4g[6x\B]^h9ܿzȡcYy+swrar%{z!ii=pd"j%_ǹx<E b^on(>>pz9I`N@|/mOôy׷gnI1\ƹY)Ed(5/$9Z7#qN w-YX?dMh\.åO #l@@G9!I#]/c~Qyδ^ !*Ȧ?닛[>Ք{#"(H'Lc8QIFqYЖ=ceD R$dB)w*Q̔qT4RCy|;YǣwJpY ͆(.p '/Tҽf#FI#ghag].ys68~exuu2ey*?ƤJXߪKl(cGDWN{-"gi G1֒5Tb⸱ G'K7n hn5ք9\+ 1m244G }oCkesf?_KŸ#B h<$BiŝTK V']ˉ^N Jw$It1WL (++fHXȽ s+lD@42" m`\:][uJڪ 3^\e+ \@D&ёӜt,ּ%J"XKzc co&:j3gEVE#ԁς(UU_9ٷU6`@)t3^f=mFa%:}ݝWH.}4u.@P,Jg[Iq4M36p A%HV[!O{d !1bPR$Je1୯Kwi_rE$AGr!JQ9<Ĕġ ra܂o!0 X@^ 1`"e=A2Pi[ #d!:d8٤d;w&xi Bb4W#Zyp,r2H<RcJl5Ԯ̣7b+#*E;A %x8W`lid1Cad,DYPFvH8&<Lj&m .X ɴ) Q`D_jdJы4mp\} $QNDjԯ8jEPHB!2i/%FSJE%rUs;R K,b zFەKD"Zf)hRК2UCFXfZ,*fV1KYVXWÔ ⽴@%gr2Bo\L'eKPֽv6 ~@kr~)(5ٻUVoKK}y[KCivd ڦ6kgX%S3hC $;.0w?Bo ;ay0i5kgf8#m}v}靅E? [nN{{gou{ 3LJwwfOo>~^/͞7 | }x64w5j]: uyt4낺'՘Lfژv;u?q0Ӏzq*=<:S }sͮB(2u?ˑ/'XMd*LRP?az6ȓ)SQ+5re0G8K. WG9_ $ W턞9?J0ob/4°; rGg9O@3 zIugpu6yh\kz~h.:׍^ȏ ObO/WײJz^םv|$|?ppbҜ&lϻHOWyz0~}SYjug_Os0?uxyuz? ht'x|˩*/hM_Sś46<*~ O؝avt{xkh ,;k_ө^`5%FpwBPWjqz1B F0c)F5M Ryg&ߛ{|o߫U~C:#C(͔&J 8*Zq/С뱄+Z_eQ!ZZ;vJD1 ;V B \hS1 $T;m6BD^r@S(\ZPtS۷[5~<٫7ĥ <bۻNm:mR3&RHp0Tn@1ᔵP"=# nD!U:Z;84BYaBC֚fDLsbЀpF# ADzPl #,k)u12kEFRZ:g,˄m\PY}Jn`^Ha^Dڙ`3`,-p)o4| ^xCYrdfK&n]sa@`|:0FDϹS)n(cԑ!d揦 i..^x5jiH n7 ho=hi[3pWBYS7ag6mКYp};8#)qmG(>èSh WxQ)w$WJ$D"=TP򎯺峉|>iyHBb`wJ7 ~ċ,HwCOm-6"U'd TVoLD>@#."z@+& cbh**#^ɼ0F !>Se20:!y52 ƫ`-v&N0.;)D!Y.5TQݝ:S"3)E3E Djm! &X0.UK bH D3\Bj(S i"6x (reI*:R[k2m/bB;ԳkN]<~7yתBzjeLe>ʈbH:q;@$"kJ",Ty X ZDVQq#XBֆ}1=Z^?_4+f$ oV\J֬p"_\-s hݲ*ma*(lBiR0W(2F-F*'Ҥ h u%*.޵}MW&e2I6M H}8MʖrFٓmR.ak;%H[5MUӣl/ׅB֣l*Xb+$kӣlw$X"%fN-CdL4<EcJL}4|L2qz$3 xr3 :'R*<JWfakPL Eq4,3f4p"וٻ6dW>9'aZ%@F]lj8y8}hˤ,Rr Tu(J5m $tUuW%HC'XiТa|#"LB_iحD\jWvހoPݩ4l揝}JcրTw(2}w}qӏ]j͑! RzBj՗lZI63>uGҺD9nu< ]8OR7T%'ջLJYo .{,&w\lSpt|O޷.F5h6g7cw4o702NR>5JV_+rfyZ/qR]zҥER4UVr_AJ  ^Y ;ivz4_"2Ç0͊-]aW4}S_QpztTGdrW'%ZADQ?bJoɭ4_J]+<ޚ!{5(" aoF2K!NM"" \"C|,Nm`EY\K{365o>a@\k[6|sdtp5&<4(hһm3^]ٷв 쾵в4Z3,# )o\A46>/W*KawMɏW7\GR14t !*j˅gB4Aj"=C8H.V*D3 Xy_ }/@+@@(k#D"H8-@;n 1wX)T`2CC~i/@2jbU#Fs rgi#F %vęY92r?ՠooD'ޙ Ѥ tЋdn])f`-߾hNk?.>vT| uԿ{5Og>=9~uO,(۳ xB$S9-EMZ2!GdDXbWWM``Dg}E$Cg( E3N}.+/C57ymzB3pNWI}lt*+tx%a"H'66ibU؆#)8*NPgÚf}kJ(ԠUm: #ƄEx{õ5\I`l*թGp1tcק<9}߶kL߷5`,uHya8Zܳ׽QKCP{4la1'H0K Oen{Mtz<׫|8Xx\1:bz] Xjѹ\/䀿 6LKw |rFFqH+zBt녊>^;vP;(s;A8O+̨Hͩڤģ5gac"r`9~*" 2 ~-,{쯽EQtu;UYt ^t6PLf2"U\3ô:W3[?w<_-Z8pt:9jҢL$TIN 1iwQ`c\8a圗v 'ۨRM_ٰJ߹n7_Xxu|EKCcE)̧azwj.`Z~yO~|ճ.jK-~p^8şSz?Ng~w:J{}W܍&ٱ iV~yR?M\m1/ᯕ }Ւ@ܾL4keV(8n@__p˓|с]tpQ/_ӮxiNnP:_/Sۤ/?pQUoOk* U6W :0U72DC-/He U`lbUSGuܳ%(p֋(Y0j5:{aaK}Zӕ ѕm dizw3h ʿ}~EzgB*T9GJ&&% 2 xK ,iDHtAO!ZVUe}:j%^?A_"XfX&lxn/1ZN&[ƒZ0:V7xW\&\V8r/)kxj6$O)!wIhT[!K C#[j :JА6.c {XE.U_fn)ϪOLx/"a޲"*t|% M^XKFBaߑ ʵ|^p<)0|'Z[=mi~ R\jF͈$NƳoةZܬڑ5t;B%|8nUZswogIJa$ ݮ'*nZX@>,xC*[#\AcFRJ+xd&k`3^$ WJ`$xb| 45hsV\_rqĘ0le,jIpBtHI2ez HdiͦB +WZwcc Q|9@Al3M,fd;DxFNTx+vn/DI: Kad.P'櫔T@ 9IJqbpBPm" X([GGaD,TLDT1W(HaBm6BhݻM+mJIm^ޝ]wˡ1Zmwx s#jilb+'nUnҼ1?{ǭ@O{%4``ω ݓ g'&2'@V2鶚3-ɶx4fU},VnQe$J,l˜){W@ ZILa p)r3>5ektWP&} !!!|SfIZea&HMSPDZ4aTπ>:(ФE2?z):A*`G<(\˘XE$d stSuLa^2lÄ>sf$=8Pi׉yFfgxN)`cpt6 1鐿z(V&QIr%(skQ񍦳+Vwmywb :5)1)L*&โ5BJ% R' RXrQF(5!yۊI$܎bF71 &nQ(f@HL_M: 3fxqŤ)pdxL.b4.6|3Őm3=!T*DlY:y]r;%Wi#`(.pP6Fb+\3R2'9'6fMυ4NKF-fgItTq7 4lQx %xԂ-usu2xem35~T3ѕΰX!a\DM}=ֆO9&+emeDKH:j, %#C-42 36Iat#gܻY Zhy%˫{}idx%# UBӴ >֡BvC򖷛Ը=f֎yC:)d+%%5zL KvQ=KqV*EBV![)RX  z, $+3 D"]KȀȴt GRv-%Y|SgJi(zCzmT pDB4T9QNV;џnU'HH^6J_6*J+A&:D p3Q9Y [ֶ=SSԞp!}7:=ơDp-PNM1_i:&PI$haق:gԥm30-m 0IAq6=@=h+">;_ip2Ixnˣ.*m<^ʍ _SƗL^i\v- aAD3Wh~/<_\QQ6+cs+VE{;Ls;z}?<66yqehz(ɬRMDƙY)(;M4r4Zz]'5Oϗ?kW 2F9vhn]½j71ceЯ0+I/SVmd1$jV9hmkn#\E `9;}_}s^l 
Ԏ<=xaf^10/-CO*[l+_'R4;hwZ@piT%Yzapʜv*7R-L0m8I}rtoPVeu0d]LbG- L%L*!Hdčk^Dcj* 2~&SesS:c'ەХNDא֬I3ǩcT)a 7hY")ɼS}&RfhW1{EU Q%K~+ѡ4i bw:ySϑ &L %{/7h]_\{U$4^ n4W߿ AY}Cgr]sgskE;{>ߪxv}yvJN$QGJ9/2_jkC^zP+?{N8T}C .JmtVNV'?hDC PIMd9̝[t=uЬ@~qW \Z\6)%,>~.ꭄj&0 $0f?):SEA3cc_ T d<xmSײѬkdz3WE HHkdz% 5mlLQϧ" INU, ϭzK=Pv-wjCD Q2? n{ z>HO"CgZ[.P[=#v.a(W9kP{!OjoX` p>h4H =:eSd>Fh.c*[noK^Z 19 -wBÍm95@#U0a7zGo=ɞZ\NR&z)A\^{D3Olf4 b,KgZwbN(cU%j%흼U"Fݟ,A3||x6{0g\vN旔D8hyuht}E R|ü )eBv؛ogDt"W՛sӴg”k ])IC"|0[ʰ`Yޛ 'ے {qmQEB*鲮2ѡ2A?q)5׳σ;C DY"s]z=1n~ryN/ue Jmas JL?zKEgxZ#xpח5SC]_Y`cjag@OiPʩz~I9 5鞪lTi ֛jJ]W8b[Vզm-Oسe7\ 3ܯey#1J썵O9X76?Ϗq -ã/VTr뫴\)tn8Xߗv8~|g?];(>l/9FZZoikRl_Ƨ$zǕJTskyfo&ߑpG97'SVQ'0h@0zGdp,#M0Gu:kbУ0WH7!Oj2dmJ%gY2%J} s+ gXdF{eu^ӸV B'>qPĂ{5 g!T3uB{}0)?Vۺ1֠M %؉&s!!\ G,x53"CcIhe%! a,Sh#MSGn4]=܋qѬ`1p#P:K5w嬂G1F A$v Β"=8UKkՓ'W) ?}?ո^> ^#*ޮ|U?Mjᛳiv|yq?7_\v~냣~yt6G7>"En޽I|ϲO(y3hDP}"Mڙd8dz'<@@c9X];>^3"Kia %g38R*;!t=Y:'JVt`]m6+|9p[q9 dn|ɢ!L'$[m)Y 63TzRXUJ % usA? )ԓ)ˏ2`i>ApAE1O#=(4t ?i7J ܀d3'y8H9A4>!&.uckx"Ry}|6Z7|M/;&QE[1]3(ӥ^< s(tZuwbR|_K-{z bM_Ƴ藉C2{~}м.۷jѕ}FP{m=['t77yK9wFV9QNOhb4^pӸ fu4;]PcL:KQWBUIaULOtϭݳ*v{egS 489יFYdy9n0Frz i߼!\_0ЅZ̡qIhjsxCVF~ÔўA͟KP CA rr F+mFJ*TH]a@! Gf5ʾuM& F(>A㚨JJ72p>P`^k QyKc˯ϟSGt֭'_vS.Kia6ݧ{ zCoUOnG>fh^ދmSO@{XXvBv' \L 2!P݋L:G +J\Ֆ'G.Bs& hNP#llvNdyOؼ )}7lT-0˚;@RӛMAn$ Wqs-X،ڡkQ -0t */bT ]$&ٿZfjM a5ro ~lgu0Eg VbdhX#ff&C j:)i$ f؅Xbdv7@f<% pA{ךNI!7m+jo є9 Z(c9x7!QaZָ0ѫg;J&ZP uURfhʼne2Vlۨ`0$F?l*iUE)L[r8^OsF, **4Uqkx !C}&C Sڇ6\հe414 6nG0'b`6 `B`v!@fbL->[q$u*bt,#QqF hqhNZT͚$վ7ݒ-¬x߹Ew*z:;5SU GIS>!g_5Cξ.iXS 1HosuŪo{q e/rln{5j$uth{}.xƆoַGaioJS}5_Kߴ˧EkX[[qׂlSfLbﱐ?Э}rJ= 焠2U Ivg7J1 ^&oqN3.ӈ_ڳAf5*.d" pҩ=$nI- b:hbNxNͺw,Zr*SP;hbuCs-ՉGv(UO-~NuK!uJ0^a8v:ڌ? 9$ж m𴡰&4 K7ZsEv{'J\ЦwE/_v`@ЙZFjl< {AٴAȍz: ꬇TW CW+r F[cheG,$$=49YcH$o-0,=o6b _f{\t,#If3F$@װġ!W9е3f]g=ˆ9f v 7L(~duJWHK99.w"%2L6,(TVB`F"ۖrF'LK-U}~A#Oj/!rBӊ+M-.iHnem.s%J1ʝ'k-\Ң8esF ,6uBJ-a!; |oFP85 Ç5Ӈ|kͤJeQtZ̶J֕x!o2/A`7Wp3&OֶWu+~,govjs|M{pzB ;w2Maȉa%SSJDRO@JAK?LNoGJϏ$EIȯ%=?6%Vq$%%S%I)oIJ%IޒK{]I~+[R Od,,i.)`W"Zm'9F߼ )ILÇw>G,! >6xϠ'iF/ьX~ofgVf!-%ZSĽAW?32]*)R?BήgM 9ny5Ÿ!}!ǟ.IzgY+7͸HU%琯BtlÞSMWՍ8QF/3[H^kS.fr7s3v;B D!~(}H_&o1v{'H_;^οw Ui \EiRQKN:RYI'T,8A{i7# Joz841bjDBV*\ɨw'p/rS+@Oyc2B F@ d&P3%\Ecgm57$Vk%e-*@9XrNd(p8X|FTtK4'G7 qQvJXBA溢Ӯ4*J`u+sJ):r8prQ3 B{{A濢0vnEFo e_ʉsR}A$.d_!W>Y}˲?2@.& 5od\M (u* nҕGH9.,E}KidyJ%{kqI+CMU"W$+xi3!L^ I):lbUx!>jj RN941gO0v5!C %?raKud\PV$7˲?f&?rAvPt|Jmo( (4ٸ<>8 ID1'$mf|g|W%*! xUPm2%9-<*1FQU7oym:ݰ!ڐ}Xׇ"ĥ'yQ~]&`j#ބtq[~wQ.) <~xwd~uxF>DS,{nL38=XpE}%fB2sd֯EeTB©L) ?V?pQsOFw>0ѝQ-}y)BB.>)ABTQ(^I- U^7rKcoZN{fK."UVh (5::w~ԇ`gI5P~d0>v_P N56X^HмCK! ]SgDa=Vڕ >W7giW+D|z:}%Xs烽"OkgD86rTl_?:"2,~l ,'_&%{q.(b(뢕#ky]w7s dkZvlNBw<6ꝍ&Ŷ$klmAj踦ց]&ڔp;ϸ"+ɸj6ʒ rW+.&Tzl=|_}>k?G=O+?ӻn `Ҥ>Kl`)I5xzE"/جf\p2yͳ֑6Lw\3"K9O?iyw!3~!ΧT3YGJԖ[L NsCRsNg Q5ꢼ\ "'i+D%8I', Aڢ ( =9Zstq;X rA3Jo_S!Ife& ΀:@1G~rNUȡqd<0R (IȳBC!<f(7oM 3XSsoUd"z>%Vvτ Eϡ57σO)Wؓr=ӵ=,q$JT<B?j [I>ޝe=z-"uU]`f vI=@zj̿GPCS$gS>. 
ي35F w >.d+:a 4Iwy#Sw]RňAD_H`jїxu=M!u¡NtdpUzNOz!^&x^w W0>KصbQAtB^Jֆs<:*yo5Oߌ ,yzݑZ 2\j5Qw[?~7(wcψ:~W'n"YO|㎞Q*h8B`P[ʶvQ2=DV:"@HXMiNl!'#uCfeR]Zr4pc Rd _2$JXVp}YdUBUV9˶lCVaBӇIiX5|i`A4_(omcA8(t9qH\4AR#K(;#!eEk4eEk9kY3f49Ak5TYֽ}|1{t~U'U\$~\!({'.>%{[Zn9EfзvnݽZ∴6r{}ew!]KcwB8m6Q < d#N*Y®-@oVukR5x k 3]G[1\ RUף.1b5զ _^L]> (.r5"]iՊˎYwUzؒ~-]`ws4|ѮyJ|jwxR ;R`!BB$7}hi)/"J܋GQ24Q_31-=4܋Bq ܿy}$ 60bQ-8g#B萓|I#4  A2ZDE02ў/7VKaS܀}&S,f w)emS hKX!weaK9o[@מXi|qqFnHֆ \U'Q9yƱ~7ˁWEOprʦNpbwDؽ^; Po\v-Mnc|aÉ{Ϧ[<[:P-t9Eڥ~۫ 86"Z!`m=krX'H*a޴/{Fu;טu`}eC3ffoڶE9)-SOcglܱeo{󺛾߫BV(1 qw%< ^CP5K;ƿ X*k?5ӗBzgR0J&ҲnY|#DmUJ-QSeVxFr{z#?(vP[V!DN ZMARrQW#JlM|*F>XPOX{sBpo*s{D57Z%ɷpNFPFݠFdv-6p;'J^ntmѱaD]¶#8yGT1 uX|lQRZ3܊W :GWpCª-cs;JAUAa?[ aGMJ`tTr+uN c1ɚ2) 7SzkFkW5~yn( $[٭pkV$V\҇0_5!0a2Cn q#q!o.IowEހ E^pRsޡ_<'{j 7 :+Z}bb[s>Wc})۠ү_k&k[ GQla,CD:cg膩̿h+u-av z".H5jؽRhUp0Wxh">DBk̥Ǿ2<*F{HHSEcؚ.H9Ë Cq 'HhN0К i ,7'\uw \so~> o]'hٸ,EqwD`X,y~Ǵ}OTF~FM^S݂i%Lrڒf3dNzk.?~ɠg.dqw,A0Z}Z.ߟcGWA_a&"ɕYKx`򔈕q|ű:3p>Pep0蛐rHcc0mc _XI!1ȃ UE F!d߷NJWYI8|Υ@aaqR@&# fQ)!1>1F G2PB"AY b/YdJv$ XM . -zXW]w sI~p|,2;V^Nc ְ(c-J׹-{ׯnR\k[di7!XbU.Q]D^g>Zh `MkSv@0Q% $jƍA"ݽDW|o 8VyYwS[Vvk=Ԃ59,sh{S݁_&7ʖyK{l0 x&UIӍ4v'6ϭtšQ- ?&e:Q\Pn8Dk%kԱumLTpՈ6-+X6#.o$O~܌]k3t|X{_?B.n6񾋾B\H\BSi;-2}z6Y16 J3%->o@Dյ_qgUz:.Z+XoXҸH7C-52aڜW[)߇&fBn^if24lfn |׮\XWZ<W|}GV*c'vS5؋r;ƷvB#fkߴ:%DpB̥PH1Pu;Uc{!#S):9_AtkEĶbQg|N#;='!|e $p::2fo,̗Mz_?{Qӹ5hSp>d1Mt'|1No 9ibZҠgL5&LLf>]u;HrJkH?3yPܖ/8iLp~ &5x,hq_?94Bv%hX| rGȈǣh>\ƞ~brgk뫗ٻ`CDØ yR9J9 ΋h(_R .EC|0P!Ub( PU୑ xTr)J}R`2^[4.wOMrȌ&&L&h|vO3O\ډ .ΓCl& S/\k]|:us'^= ~5GssY:|=:-n|si&w?3˫KNV妵!wx—'HŏL_9H0>=xww?M@BvË/3xo dta`cϝu.STKF<]iW'M,nz=M5gGhp\<]\7:G)jc?ZyV ~Z9qܟ !?.E$?WzP(C˛&ɾ|,p Ovt '{Wxg^V^IHfdo{ t(ңB<~N6GC{]|6M쟏-̛fgw p -j4ΤaZFܝY_M̎/m@gܕ]!0һ}, _[zIJ eك\W@fh]<ܞL@aLwdz#*@`<x5PLXE6 [r \; [d ~ýp D\pdi6!ቨ{ħm\C9y|yA\,٣Jp}=$plE8ciS#ZFSpb i"hR!%F7A[Aݽk.e~J}g@8&#m1Ay<b)LDD0 LI08Q9-׼:ϯq4ĮA e=ԑ{q|qW\)(؏bc# Cb0j{8_!p;?>.$O _cm$+8FP io"{jL sKAW9gZ"}w_I3AL!Xg᱘ݟ۶VwXbF]/oR[[mp&U7pJkvOF*`W_J *7#2>)k'&nY9Ü֘u4ET蒣dƉ[}uY7Y.Uڔ*lP>ŏmB2 y TZLT3sm)y0+8mOXlq|RNem ?ɞZfIՆ5b %:@)n%0P? g*{&Jr ,1bp,vNY|Y9!bO3jIjV_GL:v;$u$L Wݻ UWBUPuU/T]u/TA ģJ";WMSt|&y1)S4Iu>|K?@KFZKKW5մwvl:쨰UD#sRa* ln4ONXQ])dz0RJd:KH'UQkK2:2X>h O}*l룩p_S79Y, "| V-pYĮm)D(M3Fg%:]R1gV4`-MRߩ?X}1gD rG.Now-?W窶\ՖnϮf51mLq}Tx6<S4Iu>`U8e7{ii*i*i*igH~s*Wo@1f z^nKKxtu~YZ|i׋m&I4I:Hk9̼RMڦĔ [B=ibl"ƭñs/`E@\;arKbĹ1m76t: 1 CAj;TTTtĦrI e圈[mFgFgCPLBMΦ݌^[875kԬ}SM7k?yI)BTBTBTBt RIi-fmݼc <hxkh GPgq%y CFq~euK֥*$IøD*Xr XrjۘZWOٛk惋t~=e{볶yxyt=2z3gS^^/?2pRǩDQ&`J_bZ7Ezc bj\|{Ƹʠh DЮJ,ZzK5Kuw^J'd)dulz zz+ )uHU,jZw6^O:Ccy|?}w7kQR?߽7 H %o1\g9M .?CQs-P[Q,~˛3͔|>Ѐ v?|{q%0g sO WhZHȥ:<[|@ç+Tߪo:q;YX(Ns smߘFa׭y86lڇp~{'anJ1*M8o`<&ޖ` 0%PF H;u dC1v{t= zéVj;}!l K g7cC*fϲ+"eIX]l(XWS\sgّd ޘ,@ř0 X+h"'[L.5/:dN684/_zܹz 9RP iz\I>Bcm\ggw&oayV#EEt/2KÌ'!mbx0Dt@0V1e4`HL`ш `ꋆ 18'ێ,Óú lԼͶƹ)yU2"$b/Y'zϧ~x_,1ȁWIDhhIc'&)f!fX$'Yf~s0!Tp|Pl]{f>m] |ݽmk7/~ <ƻؠ]j@ڟon?ob-^ϛ߼z5yq߷o>0,H7eD~e: X8L9Sg?sofˁyY) kێ9@9gϧِ̆|jw@U<{PnfA_:?c3 ar9{K#u`AMeZ i/$'$8wiܨyXMtEUia3uU1\nBk)'<:G1wBcCȣ 5qnܑ:4\R8ĸ|VfOo"E:;qȞKU-Sd^P̉ ӹ(?V: {!&g#5뗤^*>w0oTېxymk719y<ak(W_r#^*jHv.g3zKU¨N)72\ Z\}lx%ؐa$r8ã"o<-BQU~0_/\Č*s[[x*ts'mŠ]NYTTʔT.!اDJ3Qq>&W/BpK=M/?{`a]lz4IëiQ(1={rtL@v< 9dUv#墛񛒉 s,xgx֑6; #D:q.tDtư# `Up*Y 3We(wex-4딉cMl/d׈Ldi >DX9V< ,Nf]8D|,Ì pJA Ȉ׈+tqA׾.O5YtJx -+LqJNd&!^T#B2k +҃#)IO_ hbf$d"'fDz".4%zSԏ"C=JSBr* qjZ?5]EfS448;cNlIvK ՟ŹnߋgKMISh֮4VM0|sVW$^q@p{ 4дڢ[p,rT Fi(4 g$i=ZIWiw)%b Gk#:׹A^M3]kL!{KaAFrҼ1je-Lxj"l9! >U3V{l{ƴ[h'؉@:X&5؎>J^$EU2BF+4SkVԾ\in "vwo/bb**Qd<1=Dm3e\]MQ'GCu(|q y-u@.D-Zܞhe-0ǏVI w]-Y6…0W\H5ݵ". 
#9Zb$k7΅t4R7EM;1O A;s7)GAg^wc Juj+Ct"<#H 4Cy0|Bzͷf¼ŌBL5_6ႈnxOzBMAN9yc\dI3K),i T-7B}{n1Uiy`#o,+Ѭ$n ªӭmm)%#)2#&&`_/MMvTM6Gwmm'g3Un'sR, SLRrT64Ep.JئD=_7F#a26M ʔ!z}B6/Ԟ[[5dۋ6ENG7*R@W]oR Ϲ@5. i ʗ)&EA_N3 8U5fF77Or5 :q^BKy1<vbNvf: \~"؊_# lww;l G 3"hb!ܙ v~D{mX QGMNrNmr}łewz;-Tk\aH.R,gTb2F+t8wz2hce1%;HytR q%]\R݈h)u}<+Mk7!i{k|{9~kHwyEuwˁE|PX՝&Jw+F5 }- qh% ى&{<⢰]6I=H5҅zg2պ m/*jBxpzFrEHKb^u*e\f/FNg&Ji1H!3-P8'7ߤV5v1cTA5k8e-bsqBt2NeS]+a'Qv&KTE)-c}fZJ[ ΘlnQך^_:e-Ef³ޣխк  6_lcWֽTs9V0/*:%mo{39؋D . C&tE9mxaіl4Ť2.>,;v֬\3%TǶ@= ufk38V)~8.e>p^Uy>R֠=g\W6g,7BBEq!T({*oLFXwxY$!$w`7z?/Vw`qև;%* N.θG! ?y3}{e2ցO~v%5I_"w7L&?{:x?8,'_~mƷpe%&iL:Ƥʥ[gZ7㸃"{;9ÚvL~M \ֺ6ZJ ZڵaKNy'zד4 ߍoW`g46l|aeu*mխkd<PUa +* A c⎅ȶVO/+dJ$cse&g"|֎brd>]x9 |5Il3`|' ~j`Z Cah50,XNG׉*&%h)L K3'6vF:Rƅiiq/y>؟#βą#fpPu 1+[fLѬm9_ _F #WyŘhV zb ¼Cb{_7 yP7 yba^QlI!# rjfاS'Z 9R8'ZlTIlR&DxrFb_ d^LAbNB ~qqSр26(ՅEQ#Ko?%kZnx2dr|7Kݰ\(Wş'x{r-ۓ7"+5s`ϸQxl0wff 7;vHb3@"6B'qWdV@OCKflF'ryni84f+j<`4s IF=EIϜ58YόUY)G7_J\=% PdvT V͏ Ӟ -K6؂[$3#NZ:i)(#:uУpԫ֧hlƈAT -*Q.kDC8z ép>5<,@AuaW"Uh9'?;\* 8.{G޽p,p;G? 0W7' k897'`)Rߎ_O;c30߹^Bd7 <ߕR)9d}{6l U+H&`vI&l=O ݿ~Tk8 :tUKRf=FZ0h\%>sǤcιp' +q ^0|$SqޕBO,ej KL/7`螗MT j.x!.0r JZ#yU`.gx gj%WEGD8E=u֕Z<he.fXތ Ges7pٝGr`5Ζc[Д,aw^q:-uVqPÒFs.hK:*J4=k9SBA'I=&b*=[|8"%چG6 7fiVМL ^Vl((#zDbF6oPT )ֵ!lBn1PN0 {䌉5-# lD6k`p+ViR"iTD&J˽xc٬$O oYJP攷Fr9I;EZRvI޾ݕ^_CN( Kl0>qEii$J pt'{!C"{,B:xts釼9 ^_ T#'g?'Wf:2.ME[ 4cIie% JXIv7VI8.1yE8N($ETg&[Ƹ߅fqwkpw68$.7դt(^e)eS<ٗ'D4wõϔxbMϐ\a!(֓KJt.Z%w>w L]CbR0q;efqj0k?MdLaԺ$3$XH{) e}a_άnMΌ4#!qH[dB8iVI1\ 0]`7ҚK# Pc;I]YTТQ%whSE7ZI aQW{-20 W*^ D0o9ܵ_.]KCH3UZEAܵ2K,"6Xk ֭Ey;1l)G1|/BYȉiZ.8L(TW=uT {G{_\gޣ. }8 ue)1"[wdz:[wbCz1dTkr|0d]rʚ2"J e]rJF_DU/$8<Co+:\1>E!S4 ce;-3S0A8#c \.K s& 7y*2fHLR(AH$>Tt =.,2?;@uIJTJ<I%2@Xm[>.-`FS:Xッ_\I煍n@"7 Gf܍'h5"շ[\!Qr59P{,OY%g4a#d TVmîacNlZ~v)͖1A -A\~WJ0՚ CIAQ[9JP˕4BvU}pzwh3muޒ$B'y ĭD&[Xc..21kNAeQEq/bf1aགྷGuY! g.hΓbGl?_ Ͻ%9f+=wrG5Un/O+Z2ěn tkk<le3{CeH>SxaWcJ-0ڵ?t%^A‰(00C2:x2uw9ƽmIgEkR,C$;1#. -:.1)C1Մfw>I6DN6x IFB¤Ҭb%CjQ۱~(; xmFKPN6%5h'HhLkr>cOW}e,0?X6cP]h)*c4JS'\,aԚD 3ޒ(K,Θ1w}"™PRLk(m3QKJjֹ l W<򹊺Њ1Ҏ{?EE@Q?W)-GA)\weBJuD"#hJ"&18É:U4%y(hʘoԔuMLgoڼ{hb8pC 'J e8Te욟nŌԯ"RwCjpCW>+(EeԆZEG(E@^bWzΤ~{GQU8pC)5-䣎wqNگs68lyf=J39XKx(֚PhR(\8Ù3.XTZ(iKNw UrWū޷~Pw+qJtA~UK{M Iz"a3X)V2\9):Ⱦ{3ooӛ nTۍ_^ݙx-> kQϖcWxx"Dᨬ(Vavsq_F><%Rf./''%mՉ=C/ Q-X_ps}עKh|'e t6Ykg c$=7 znܰw_:I1 c"F1R,ȤTG8Hgqf`)f}-<r_wu~{z]Ձ# |NFh06/z#Wy uH>]apM5dXtM! gRlRX yJȸ`:" FD2Jɨd !@\Xr`tIݗ޵q$B VC!Js2HBt̄"R ݷzHI×ÞR>-գ|OWTD"94̙֒TT^ APQ3n(9&Kƌ`mDǾ(Jc_D+s4rI(EYp5RݫC]+_u}u}jq`meeH+a6v`fSXi ܒ:Lf/RؗGfZC}LH"uMYK Yut!ZTp^CRR#J,5$)$O"D,͕s\dX+%.7YAEʕ!$ˣ儾h(+ IYjD1'&DI"_RAmUVcdRb,g <: KA]7@ZC]DYo.4Ey rN2$mj`81^3 vD: bS2 L?5I-g=/7 +׏ y!eR`\%ٜ0̬J%T(qJSV4WE8Rĵ$08$ŨI^c^%AEV 2LVYs]Hs f8enr L!JYjI.XHcdIC?QfJAH,q%rp (8 lJ!") KR !r8w N)5ɒ,7fwL&7%qRӺ0Ip%Kc'h=/8TҌrr`ErIeZs*+V"6M%5 \ˊY R4NQ-Xp=9)7RsX Ti@J簌Y]&͂O`HNT-Yp8vo}ι`T+0!$N3 %O1'Na46g$: RbV nO, ?Qi$ cV6"gH;+,:* /]N8u\e`YA*cB qkr1Jqo3+J41˄!X:  ˔jnd4b T8W 1AL!|4ZZ%-fs`Q_#A8m\]"c7~ FU^:Ud@(˴cT=c+tˇϘ JJ . 
.hWj\<nC+onώ`e H)Ɲ#}4ۻ~7^qۀX2twJiIN.|7n0Ga-=n;ɥFHiL D8נEre3]L"M7PVQdS|b/dǺ GR2JkU+vӨ@<ԦN8?nJX1bT5?ș$*Ό^>jč)Vux` ~wv XGFw<]i쏦WpTsAhZ̥ [ϱ@.B2ޚ)аCτ@~qk#lXac ט[y/ n'@GC#"VWz>,4Ӻi1˝/`haf>ߣ8H4d{(sնҷʮZXU x]5JgVsSemXt>PP\k| 3^:3WQG$qtQ0GWnaW*I *>TRze==i?ĎveG18L֨tQ54ҍD^m;U ^SK< R 슠Iը8G.fpJB$5g j>]H֬]ਤ{>W<80XuYH!: ($JծBny6@K@ŪU"ai9Clֈ(ڞȞo(Zz&ippQQ`Q20Nּa0ðcrHk<,S<y屐SA~[|/eT{@Dk6mDv!_CQKddDh6Th;T= x7ځX TfaX .o*bVPe|0f"ۯ#WnRW~r^ݺz&c']`Nn}\MFa^YcqlfjIp:($;7 ցF!!vj5(8t[[cDSE;DkX/ jwN6 WbɚwX 2Hհ@SRv%pCj9\X+?E (E`()bJ#;A+K#P/K{q0xlb΃dW+E.]L)o/gqB2g_~?]IIK.h3Ս̃a޹&}^,{ņ/>svtcF=c ˤffSX沔͉q7SMkP;W/v>eɢ}.ǵɈbxV:UmsJ˟`Q\ ,iq:Q; vI\'*kPJ;1jGCY"]+*vo\dN.k*;n]Hv׀Ovp1 =1`J7}qQ헯.PEe(Pi՞Yj5+jԴ5d(n]Jڞd6G.l%F9 ?c ·W?y6MgDʣdBct[T>:q,N~7 КÏo_ڼ?\ OrA9?Q/0F>j2 {.Ij wrN6Qkh4:%O b˚w,HMn:J8a>!F)~ì7Mbx(QPQa_ f&}3HTtܘgAWWA͠ z Jp yqJHrԚj*<%ZRe;ț9cDP&eV( DJLE9O2V*+"$h|6C}bgrx-q Tְl{9Rק;M"-k^[$s>Nx=+d.%g-#J!L \Da&!ʔ$3u76q\?;M+t$ ]Tt6AZ󎯃as[ wo;O’׼:b+^H჆׹U+#˟'(|TaMyX.OEt޷Nn`ư~zvL6\$KniLAشEItJ5yfyΖ9'gGwx7#?]$x*` ff;q_x`wu盧g7T 忶g!aRњ 6tx?NVG-.>)m/+BKo)gL ֙2YeREF.)%X F`(RC TB\f8B檇{3YEɹaf"\o{^sL~P"1ǪS5nA4)JhZ sݛQ2np<O_"D3M(8lIҜ$c!=a#-+ ?{*/ڃXVآXZ^75O#mݨ5cо*H"[=Ok^51KUk;iljsn@U^ad!&HHRa6߆?f! ʋMw<̃R̥VfMSQ*<(⨇*#&^g6 H6/wZ+hB۷pS]U* B#.7EZ)Քa,ηEΟk̴, On9EB?."~q_ԍ'.MZukՕתV͞FuHBY D KSFKCTH bx 8Ev.Fa^f犱vb_ty?d OF+ͤxggٕ^{oӳ/Zz2x']#ò'GG-jrrjV"[tXwpm7BMpijzzzu\MFM ("%Ɣ&{o=Tu\nu h W4glz"}Mk 4S/;jXD s+*Px(c x^CB3!22NSH)yF44CZ)f8bD E*YfSrx_Qj!]F@HI 0 )K,BQ·8)v:-Ytr=a%&1yo $>j)U,9ƥgJ `Ih[֥Z2j9MSPY8wUj܈RxRm'Ut`崤wOߘk7=+~ p74uι/֍3=~WSqL#S\w4}owK^\M \؇у3> `Ԭ,?NOnX8ʌ_Tne4<.ǣY,4sX5:I{Ԍ?:~5+:'wI~ |u-gW]n}_~hziY?d]~9K&ZǮ%YMo2H6U3_~8c~5YK{WލWy5Lb<\Y7{PYnA-ϷKA=|WɯK\z6L%^h?gh*>57ٔBډx|) D5sc;`-3߶.|{Mx4-FmV@/ߡ7/fO0Xߙg^zW@⟌_\j愬@~c|a x|)Oo otus0ɿm4IŁ B;>Wz_EgvbT00m^}./h<7A-?~˿`bU|>ؿYP\DZ\ý-7ܝ+9|=,q.2RX=xҠ\_.`?G)̔G7Ia0J~`2<7(et <?S4uj95k<2p9_DsʾENE}i `IT1< zҬ|ML %:tm떛< `4`sQl/a<`3DaX}>W.xbvBm.R2mbԸ.ޝwԈ3!ga~Sd^Z9_{n֙,YA-ŐMKnyԾ[ ݽO}Z*v^ RHAKߞwN/ZwNg0 ]sȁ"ff&ְmP_EpIP (]FCpsb)=Cfvnz2ffȖ..w#%[ҕ &r6d^-H+uo Q4:s27t]2o/9+O,0Y |nbzҡtij'q. (vܔ !W&r_;UpqCE޾5JxILQZR=ֲh?el  {h|9}h%J°^+w %D2z'2*R,9f1ɆG$q`4l,VE;SZre%"(j'%f6/58Br8gN߉q dS3D,bC*)a& 8"Βa3VDʄ9<0+ag1q'IeJMXt)ZK6̈Zj,ԻV9E>1p۹zoH(n.ڰ#SQ'ϗ5-N&ds$*6?% hh K RaMvcnH/t@I϶]c%/jR3%?^` si?6nќXC8u-(kkc c7 Yfgj*L-o޼uU'525Nѝ\θ8rD+;ܧPZEZav\$CF&AC%:@zmW9.oiZ  iz^>YaIGikϖdI{>I[Nԝ htua:gOw8?WUk!hMa?z@V(+;\y&> K<%5N?OY$(gk@ Rj wIly184X+A(櫾Jk4LePIpWcn:;!+/9:/l>%fqd-꿱EVom[9f-g.@!eHE #pLuFVW֪h61LX&IJe4UbPTŷֵmt1~q=03Yk_,ޛC„„hEg4C1m TTuqI8d#.?Vnu801f0ń Xj&JgqdβFT8:e4AtG vHJJmZ$(U'3"Z!'e]Q5p..qQGZNf!ePQĆ)4GQ* HI$c2<6T||-_j8,zέk1n"Dfveo&i%Ryl;'M ]J)uBҠ@%łh V$8U-I,!X$$o$i %8cR^ a(D)3QJ_cHHT)E/σTӷv$v1ao<٧eez^q)4̡x׏Hj,X<}AAQX'|zt Yt96!f!׏5/$(?zao|rsד VfG2񒔃GTq =0޾'="1GZ H.l:GK4#'QGD8NH:M҈,~  S `u =m.%Z#W+>ʸ&,9;'``dIK$A2b"Q4JY_& ҈4LGqbew$)ʗn,`kYU%)XpJF" !*aR0+M`H1bT[G鵍9`r#m;NꎢDNio3|́ځ1Nır#ɻGFg2s1 ӌR2,B"Xb-t2i Ԉ8Q#<$<:zkհ--#oXfYMs %UեƬ#)00KЬ%θb:}tF Se'GJ,/ESw"* dcՉ{Kޤ7Jq0ו;k\@#hD8{pV!Ǩc%鎱n[v qlRk][Nh<ϐkjB`o |D'WNU녙~55IխWgV[ ~LElb|n4)AqDkhقB0UI[jrfv)j2 iY|%Ex޴HݨK7n)QԸϣZ۲)aXݺsU-]Q(Eu3E_j1g(\.^[\ |a3־=ʾS6Q)G?;͕0憱Srjqic( Vui($8T{GvB{[ѺsX-FU<ޯJ 6!*S >uB|jZpM/X琼M-+oHS&WhbM|+ uZtNim}T5bPtȷj\N=m3w4B Zutrէ,ZQҥ93yϾ3%1.IÚg{.(ܚߎ5\p@R¯q m2\Ը@;`hLpb֝Y-OFw:7-tliA쐡糐խϺr1S!o\=ݽhNɂY_3V_y96-pyhm.}}8=.{Q.?/ּ-uc1萚gYH.љ_}TRvFz VAa>«(L>B 47Hs20jc+k e5 uY80 RI͓R]B,۬s*"Υqqٺ߾9cl_0<)zUz!8]Nijh92bbRQvHo𥜓 ֝;bt\@Ƣ˭B{-ZwTqF1!\t!&gF֝j1g!3RNGS.A`7_ ^d!#(n]KA Ł/>+(r+IWA!K(=*(,#eD`H 6B!|?پvؐٚ;{_ ''x?b]Q)og 'x?~*&ֽk9!83hlPB%2IP4NIFg&x[ !:b@fsڱsUn h*km/U@? `YNg?. 
O;tvٖlzXNmM삾˳t|?p?KIN+GHfzwO+Ca2tn[->-U7#Xwzt/dN`i٫3yu3_-~~j9Ƣg6T~ߦ3鞞WH `ڈAPRiP6 X:ZW;D˜8q>؛$Yk(B&`%pItuˉN)3\kϯ^\ -ŅD4 Slk8I) \1 :S؜s(m.pB!FTjc7&ܣ 瘀aCƌӺC )sEL6,RZ pL]Us$zAvF[zұgsM `zfڡePNAjL, ;=3}}] f%{˗@RHVhN p:E:cATMAJr]#&@S%aT\OFŰ%UR*TI.S.Cj3PiKwJ܃I' c"Z;ufCDPT0㹭12:Ur3@z[6l 18YŰoo|o]A[p^I q1T\4=ZҸ Y VyuFа $e!Nv+wOiO-^TP{[6V•%NXCi (o4\)zK(0N0H2N#}̌ń%8*2@ә˜FQT} /'FkL5AAH~ؗ\s魙~H8fBz+M9TػOfd3`6?3s52I3]=-ISLXAViJ[׋X oa%Lë-HcP&dDʍW5[BV7K7Q_(]}S2xMUC28;0EZYP'%Z嚫TUvWfT.m(s,@UrXT& ٚ&b5Q_(]j`WX6vjUYĉEa]9ZQ`UdT^H2|rrñ'!蠆ҙ;<Tx!Q[ݝ)Mnݒ}7WdW$EvJzB;H?qab[djLxn~2L]q ?(/^@&҂;d]*Sl2ؤa?LݢqMY9ɳ kRfF=}f#Qd {O,8T0i3`Fx]ەgdz &=˱X?Vd|;(={^+Fך{NZ\:|xq%#M8E˒ i"T}GhΛeSgU4wGJA|ݩXj+M͌EJAssWNܙlA*љ=|Xm$WdpSb.H)Ef{{h7W|o_L{M[7ռ0V6\xIT ,e({I3|+LZB6+;$HL4Ilv*PƸMYBNtڣ*pX`iٶQJdZEq:Ӭh˥pHc[HJw& #*T*rL>^uo%)UVm08oh:d4y>\"(vڴπ>%M"G(F 2 m 2Ar?D/{(XS02H6k !l"1k] ɹ8ZT*p!T9[@2v6rBjٙ².jrz'U,WEe [@u9DVF|vRZT(b(&kT F}_ߪ{c"4&}HZXf4scL˔)GiworC ++}P&phmW.boF:\8i9NOIK}! (9 \2Z/9. ا"9&"XՑqDF$Z BV.j0$d\ԛY~tDof qzaMJ`o7UZ L9,|^ B::iy?-<4} OgQ5+x`hFbFXEܒ?fO!ik_}84 |\/}'f)k}FrfO\#\{HRɑ>1 #"k*ǹ1֧UׯӻU HqȮ@5,nqr7NqMO?k Ǫ`0ǎrkދ屦h/& }\`J㜠/O=sUKFTMKsVarǪtF\Q-.AkJUa]k,ڼ$gsN@v.夶5u^ِX`[Q=Û-@:P!N*3 ^x>O"Ì4bX)(-,)rJ%Sh8#`?5>pչK*UĻV'RգArh+yb2Hߌof"7DW/嫺*q"%s&{w ZHU^_ z9r wI0'ՖG%e#3A%I4a*k9#w% qxG^ (il7B\pz*JbD[@0<݅s#Êqس?gUt&37 B>3nם <@4v.IfOK4#O,2]cEJ9acPz*E6f0k74d>' b~r?jDg:Vw4ρ`d >5mGAj[]ݼ*xMPstV Ɲ>ؚjӗCݽ=hlLy7sR" 8-!Pщ蛹uslmB0L QяʾH(T j`\/C kk>gw@c̤:KŚD ܢN)5CL;PMI{ײRT+5:Ԙ=rlпO@Pт僤G "RuS/et; ףJ0uT%'^н#qQn~ZA [Wן*|fUlO#<ᣗ-pfiXwrH ʄ8um%H%J]pW| <1+$|J|J|J|ʶ X8DHEab+JsHb-tT+D-(d|cr2Aj` x5AՉ^>~:` 2~>̮}fү`*O~2VlwYņo3VUƅL_ޯ@$v,C'Nd_.r/|폋gZ,'MaO._1% c5Ci/Q⸔ZfX f&k)98Vb=JFR#jYfbe&,ђri7R2Mh8ud#6:͉VsLJh))V\.\?8J#(!ZX7$XNZ(V;aixqs_븅M7#11H"a (, Q ':Dqlw>p#ո=:l!}B`+E TDB/iZl  _4 `0! 1GX8f N"G:B?$N I4Xf4$?ˆ,rDSmO`*'yfßC;Q 'G|/ذ$^EL ivFd< ;GQ"\"f2 `@9n 4v$-goοA PZ_@Am/avTm|Jmی1ob؊l)ǜ$Ip0n >5%zX|\>7ϏO etݢ˧~(9~y78@a%x,t>ۉÌwo""B8F |Htd|B߹1ߖ&;%NޏF|J`m%X˧on\sޜ_ -Bh\=&߃E!6#8psB%۽XZ%[ىx PNAn.2\6u8Dn|9 {A%4m ՂAnV vT0neN\Ӹ @.]YVeq'aIK.q/Hstq[Xkbu_iʻ,pal]"-{ϣ&Ed_ GOrbwssOVਔʦ9z,dywV/Z; Bfm<:6?[jB=~:'> AɦalzM6=&c\`NxK]:`4v&~ֿ7kտ*_ewQ82w GU.PWnk 5D aO5(!mL4Òi8a%&&Isii;Vo:J&G8(}{\Q훒3IqyHC##Pr-2@5¯y4fҜ1ex,J2I֐s2pPAhwLĂeMMǢ 9f`\;F/sXΦ:B "u<Ġ#@y8T)hbr}ҤsB*eEu6لNb- ««&+T!"<'w'yZ7WĊBkdm)OEG)7 VVޒk b9ra3m/1ݔ4?HNaEN}Ė!3ߟz>W.IA-r>i4a-YiP E~p4PTڳu8ɔĢ;1( (LjEeZg¥Z1Ql3 *͉m5//"(2#qH A TF0D B2XZEdu$h)vU{_ps-ZH F͓Ja"#TJ`dCbRgO.sd-q_F=}gl-: ALB _SLw \:8aVaf=Ř$UFGS0Sؓ,{%H) /&˗̟Nϣ5x҄0x1{`(,VU1qqҏdZW%yI~{K7f0W=уpW_k (ձʯ%va^zuŽLO"O$BYBcU#F%Z7Ŭ갮W Zk{8sj`1ߣ Sti2:W>Յވ|pL嶟.gZ0 siG~ǃ{`:~1 G6QcVso )(ι3)+pS@d^ 2)۸pn;&AJ: K; P+cǍSWƎB0IDk:_\ Dk< vA-^h{HMUW:a|sw[pK#]͗xD@q`Ha+c,5Ƈ1ޭA8<(xx2~hx~77xk 3l^c髸 z_*?|L/yzØME-dF6 Rr(  ksM؎WD˶PP !܂*2ǀ9CWh 9y;h^{ ;OnR)/OϾ-ⷛvTǷC5ti?wqF;wӮ߹sNs7!\m΢58P}NK6]M5pia:_ KdHaEI~= &37J4gbJ8\v >u(#ػqw4 W|7X3"OkmD-!r|;|; 7c>le<I8Κ:;:x=X&Ӷ~wn>___|gDZPקW._AˋTzNWnόE׉'00Ўiw1DӋn=%a4%h; n-SOP D~9ym$#s5.N:~FWNUq[Mf?N3&Ā">>нv@g&ig;N.{'мf~.D F.sfa4ɠpNp6t㊾W-jNbB@9g|]j jhE~wdL߯K?^>]ţ 0+Ԕ]ډlI?p?5:Ojȸ 0iEVRFƼi??4X .뱀D,aHYI?[ 6V *|3W/?,IDw`}r* 0cR_4z9cz-a/%=%=%䌢lNfEbY A(Cm , C9; Φe< s^U wZQ&/1oqy/ՊhU246b2 )5FS?L42 IF-Ḡ> !{CmlAjʤzdNπ -o$iDGJ$N^lCZP&ȨM}kljH QחTvzw0~`^jG\^y\?u8e\"P DA\{!ң$Ja|"JK0;LM~g4N8(s} -uGS}.ƪI7p9>8h?*paJ`<`@<׏>9)wU͍\Uϭ3'$\́u:.O ++BI#h8Ʌdyr'Sq)N̵yAV X菍wS *wg?)q{w#'xsc~m6q3T{o\M67}OЈ]0;fbͫ{iod=ǝfO+V*>k/~?t6,U(qƸm/>{>&޸Voa7՛asCPQD@hgJ(""`bZ2,ցbKM6a}|PQ[8Ǻ3W$ )nrp]?WׯZ zLo=D#%+hyW v^:5\,` LP"@S#aDi` 7 3pЋ!ݍb/|qk!&[?O6,-0 "?P.F.v+?#go'(EHdm|υ!Mϱeo5>\Ʃ.|o8j*Wٯw֜c1*`NBUS%]N& qnV)T^kUu:$K)ˢTy:'خϐ D/e ca9;rDhDȢ(¡"*=fBXG&̔@9HA pN?93aHi,IVZr<ǝA7`ֈH"-ECA>ZS[L1W(p"ȭ~ȕ 'u zq6akFrKbIƄv4,Âj 
ll-[`X!*N`*Uu2hÂԷeΓTjAA,0'x Za +Y$ $(B?k[1orA)4JAGKn̗AOLC/V I%H{[fi$7ހ-.ZX@iCU4B83h) P\EHȉ-pX/mGjC(*q"-Awfd=ϛ4fL{%6 ~rEÒR`bAHa&Hj/=(%^q<46 DBc0؋~dK*0s)Ayj\;,4%ehm,'uNv5aSƮtW @.{L=aޝam(wΗ#\hS BtiD*}칑 U&o(Ԯ2{vUM)l\i|_#*ؔ 3xG\㶐 r9^1Fg8+ќ>I,*5R ؼro05{S7L'Ow'M߃x<3tNHc7q75״FWiOҴ.UD;4{XdžXŝ֣v,62ΎZfzFJFEwQY oU <U 8b.%2GJh<Y"CӋRbPT [SןytnkmdoISlո2m5Иij=)vQg&ik]铢y yfXUPQICRaU|(@cN(y\PE4Ƭx`Go"9~뉟WNe\Ѫm|?>N1T!& Ur_Έ&&\Wa*<qHB1!%zLNP!!aZh $`/Ö1J& `#$BqEsD"JNb2蟦ZhD`j.$>Y} d*{ET1D4 -MC )S)5,?pҎ,Yi#Jhau53;&kC_-`?~/)) KcX`%m מ+ eP0A,\bˍ1G㑭T!>TIg#xi0-'K4+ka '~'@A >!Uz|Ju+v1|A9a2'RoFnjmM, 8WQ TJ{eȔJYG[֒KJݠ77mWg6Bb݀'N; -W2fͳ?g˃Z5oo_i #Jm4@I9L8DT߮TmmqI N9E6G7Fˬ!3昩w`DA>rEslll2rC)eo33\6L0#:v`\+?8|wYn0>^Rw\H s푾=ʹP8c4&*fߤ*1n O|b(>`׃[|'!.$wg)'F QI^ރgԐjh#Y Iv2Z5cD f֩R * 8 3kjp^,g0P8ERC7YD@i*"D !2AЊJI$A~/`RSqP2?^Zi H_IqNy{gogFbw㸶'wy\=9jEE+GO nwpt\<#%jxۄ. xr.Dl'6@_d,AWьȩ8G+JOz/̾:y*jT1*x&d`:UQBR U ^N>UP.AYB EyS^yo@7f xKM'~05pT8PF>\Q {&O \ r OE|B I1~L\U\sodmݏL8ÌK}8$q@ IA8L!RʣejO?%LSPrM )U8L!F!;5PHaejm )塚"zDɏ CYsy|W| ;<.ȤS)WJgYiq_F\uP Dѐ% 4AHRDIpL:!6yRbwE-:AR|ķ]%gu Ae3rJqu=lHSR-d'Th}Mу 0HCipeb"fE]gp:8C]@o;2, }?/ߔ/ir$vSN7N;G5.tk-ݸ6ЙKM!i34 J{` 2`sKgȆqe "tmU]x|yN 0/.<Ǟ!dٙkͅ'CF 8h=R}~Aț?s$mZNq dl9D- -?_cx$GNzir7n)AO~ '/I;F)H8e끏P QnYs&eVEᲤrei|uo|iWRmRƝTCn,Ђ!mcF?U?~N>\}dRB'I_%|=迈dg p\C,o(&T[}s/ {o&6ZTnq5 VvZ.XiU?nwI!yRpmT+ʵ92YÅMSWm>ڼs(CF D*zt8uf/ajz.30   ^1S!])( %*0njJp"e5EYdWWn1s"lFp'FGcN0عA9AiN(~'H>tURo&|~_ղykKQx&3Ҟ_*iW鳊1ށJ#T?B5U+dGSL}M^*f}fNlBW4NV?F3^Q8f̉xń W|~=%Ccǵ0Ƣsk _`i@^B~0@1j(yqkvpgv>+ pAvd'/6,G[ : uy\9x%`Ԫt`a78ppP?GFLK2:򇣹㈻/285t&r+E$eIJl]JQٟ!RN>BvJ^9 <1ۘlh- e9J*dӆ$~:L׻=*!&b _#)jzZ`5*x5_?(yM;%\އ^ZU^I?t8//yu 7WL`%0R@p$DD0XG"ƒ )8H^apA< F|%4۲L]tD.6ޱZ w emI}gžkpm7 p,YFZ Z, )>zG3S͐e?[t/\ZTa1,l*,J*.d_a% ޛV n!$hv]HFFWXZȦ`E6̲m%h2VT0Éuvn'~RLw=QL2 ݩ}/*U"F,d34[S@(EDq6pJSAׇ9& RzcӐCbF^ e(Owqq4WTz9F[$Ae-:Hdɠ*rXɠ`4FQu6-Cgh֦f. RS ۫BW j|gswhTEhOYz9:|w,bw5M^4AӀJ L!3SBTm!6Z'WZvIU36 ٥ic%!ԧY[˩1D$u0y9  nD+ju>Ү@NcۖCuP Qp)k:r߯%j+{z5xM .眩; ~&`CR~*9RYz̺SmlI?_.0bfdq*zS O2@}jIjaݭ}-j@Ӑ# \FsuT%k(Bc}j.HR x,a }6vgkY"ܜIJ+>rOz9Z^A.GoPCtĴ!<fؘN.%r5XG'pypN@>ж @]Uv-zԬӪ/U}A8!tA#{Qw4K,BF 0SP'otF  ]'clD"hO&{ƣd1(~c08c_ Nܕ4TKNƚ~*$<҃S5b[OY]vU*L{SITyJ1V<8]IʓNwmf+]s87~Ki?ZOHSi4mcGBFƬc䭊Vqdm$b-!7PI6|im9>ǀJ? ~y,C7y1f)Wi'6oꫲp {,wACLL;Dᛴ9d)Zi_g݃TV.(v#_?l,eJN T:,Csz?{FC&Y"bq{} `(Lr%vK4㴺.U9?z0JVIV\ʡLW+%J&Ȍ1#-,$Ss^$'&aiyH*ZԀ^ljq`0zi[sG ۔b'ש T5Rqs8mDHKPgd *S,* ƤyFFbR}Jy?XFMV6:KxРIFdU+Xr.4y"ş+,Ԭܱ hS: QBO5}[:>^bYXDrMM"}G l HIe[ =Md+%/~Tp:u0aBBQ$#"<(I~]D`eR0E=V#悡G7b٪8՞k~]:cQS!F`rdyXb\p?0f翯?7[}%5kD?˸YrZ;琥䄞j"{;GWy| {c;VsП*ŗ92\noXV*MJNlgL&W&8oUwf c=ƚ=~>l(A4dh(gT YqUO+|G9qǙ.XKgx)P" $'$#8,lD0RZtOHڢ-T!׃jRՀp)r&y??? a%E%t,õ39="g DFgϳjty}.d0zNSO}>G'pr֭ 'w<9pcR2Vl"d)QHZPUys` tNhdVǡ|?Ax.tіfNVq ibq4ZB ZQ^Rk7i.L=MQ4<  u8~[*.XUdO{GO!Adf}#P3^BIQTBqD!fQR(АuL .L(#3u7=*$7c!mqUlb4'cbCBYUKs"oSR]RcHTlռfK{!0q8nXl8 {(c{ޡIkՉ->ədc>"`OUZ>Vׇ$,`ΩS}:4 IVI`[(N$b&_/\>^*Y E5:f'2֛;;L-ZGVꈬz_phWjn[ʍ=|~)bsE~ж,lt’yץ#@u<of,,q~ǸOVdքVXlk}ayv/V>TY/fpwK2|F`f7y4 ]f;jz+p׵SCI(YKj.>!og3ʘi Vw:=7/5}<Ђ&j^:(j `FӬyb_paz<nsA(2{vǠ5lŗ'T G,q|[1M?CvɗD[buу_Y;ӵz0]3նq|yA vy9pλӫT+5juqλS26yGSk* Rzv`Gy ._yw̼NvǬ $֯S]wǦMwn'#!W^SZۄWVa&8aa$+Tl A;h??_MOVi)Ph ܐvޅVmMI[`(v}hh|} k-9[C;brJ hӰo-0o9c/3_[yJ:0%gtθz! 
$і'B%OpJZܩz^Htnm BnQ>3*8 j]Z@YO  Q nm+0Ȗ//JnC9:}BDk&_4»HX'|/A{B+ b>r* u` J/+AcMeQDUDK,7xbcSREfG1RA]ۭnav"z-ytgSv}X敖OY`: 8e?V6|ݮ7gsB)^_w &̷ Ab);r5EapkPf;hтu/QES ;DE RHem@/M ^ʒe1Yk| AVhN>Nns]H BQEДK+ H6i1$WsZg rʒ@F"}(#5ŠHFg'35o='8=9Y1Xbwh( ϜE`U"%Jv)hQmQ,{ ⒋v >Fz@**"8˺ 6ASHXb5EX{q.rJ(@P*/l C45m,Š1=$釼aUB݄$ʪBg.0,wXh֠Λ ;n8;q,<,o8̱jCfWDNu_):39NY[+ج5 Kø➎X4EZ 6<[6qm#;6Q±i>h$ 6؄2XtLhF.YN_@=͆{>պ%X ;vIj2,ȖAژ G*BTLb}˂Yq:^}PjI@;!G %kq|>N͞r܂+[UnL IZV~l?#AJK ;Rr>͏WS[Yj)7t,B[kss|?փr:Gv;hfBj?F QF#67u/__G7 _lmGAi5``b?b_]q5ȟ71>RGױj(YGe1)Dwv7 ?R_ǵ=㲱!B<1Iv)BFigdάXijR:w}0ᙓ MXPHmV$!ٜ k9sJ:J6|&61dVJg1!HrbEQ|p^a6녵d{mֆm4f|u͸.6,!Q7M/-+(lȗFҤ}aF**Ȗ1Xgc%™dCK1;$ˤe;[r` Dʻ%fce8C쵽A Hs %T^ zHG@?8Ӆ^ZzJ̄, a =Nl9Gѽa"7[`g`~`$!<(nReñ&լ$K5jϬ29΢s䊛* F#:N93Żm^ARRzҪl/%@`Q>̣53Mg9}1=C߰i>2#~I;%gONg˫*68ѰzwrbM#D" /ucw3`Fc(rC 9d)oNAl =rqa;>۴c֭ rQ 7x5kPMEDZ[ ݊/.k2DS.I[otT^"BBr65&0[C\)N6Q")QB`l8*R`dY:h)Ē|=8 eEFRApfZ,)wNY!ۖ8cqJl#qBtǵAN4EܺAeiؼy׸ԉҘÏI:h:Ƀt!qk ;(T:M VS$pY'"1~P۔_nP!w I6H+CXE=SxOs ͇ۛ}elVn}sz;l xs8 IsĽy[hqY;9F 9 O>5R`;fh1A{@dkKopO#}zA$=81bo\Nk/Ze{ˡ.Ǎ,zp֓OtQ`BQMRzruE+R 6Xa`b)bᓷfiCHzE6)61+foFD$d#r@HDƟ+ )WooOQ20k?L BYk|3Y(;IC!TLa;`̈́1&RZ5 : ;Q040=,&%k1-8Ji̅@ïu {~& (c?οՔ I*UǟzA8Np6̂Ӂ 1Ebg B[`Rj) JG KwXD( ("΄u܄^Fup~r7w]_8j \oEuvvT/v٫p9Qug<<[L L5Nq_"+XHlY߽}M6h¨Pm>[FO~_`' o/F2\)>޸2/7xؘ |~fyaװRXxEoZgX,] 1; c{!& >KcX [B0l#EW *Fqޤ7Y#U?I?9 -9sڙU؅p}Z0CRO/T~vwsqWDdM* *&g*)\}nٕ-uwң&7X mSr6-:Fws1,8cp5f;Sr#\I`~~,wuㅜ Y,{_ALR>,^uHz8/KIFW}YZpTHN@wib*{=Q6\6%SOt*SE)1i̛y0OIVgF=~?8'>0 FƧ=|j#0 hwMz;Da~;AU}= ˗}e2W788H#iAJڷ"$I 5R2 kGޗ>uJaBQF[-GaVSh/X)2`#`X i&XZ$`=c`<TB("􀬦^yJ3Ђ ۝[2kjT&<.Ir%[#j' L%bBW c6TbayX)i$`MlyeTGJp.7xW٢(v$W@+խ +y?& })j<-uʵA.T)FqYDv.Zz#c W -N -Vp k)ck<$HR"`vc!!) (obS!IZJ=pYHdZ)ZV/+Ƿ7- 6P綸IFI9[ `OZ=G2Kvݺfiɦsnc̐  ՃOOV[ֲm6|ۘ@FAØLì9^<#bhS И୉sέljnؽ$&$Oz2IlQ2:+6;|[uEC'YS/?ZryǠ'^j@"ӯ׷oxOQoP`R<؁Vs81B&$)' 1P5^q@(.6؀Xx+2 Rz}}6|fܷxgNӈ:⽥,Ya1*5 )FQИfע~Y B!>_&ƅYn__DI_Qs7j6_LJOuתX</n^#D$ŵ/1b2*Ӡ\4q[am+Xx HXÉi*&Q70<#urEʐ1ZMg߽}Ẓp]{Y ˄DʂvӚ7V*aT$uk,Y$V60'ߞ~MחA9rQ)1xn+a؋7?~x0_} vXL?<q F~ +*FKhYs0R0n DQnnʓ`d͒;ceR9dɑ"iMI͎np {#㯝Y@(7=$?D{<3_XwECM4^zٻh"iy> ^ZMl^-c39J!np g/a5,3q8AL:2` Qc?>Ì}Iᤃtq1&QMXU#!fJgP]/3?nQ{5plfRJL8P^Rxw'-,k;4 dtx6@:l87sk0I1%!ғ41Mji{S$V4%I,iYS߮kQ[wrfg\8դc5n#'Acғ4VH+#Eq")L> ׬"jb4\2NFhDBh'M%wܫ I!/$ͳjG>8㨣yh5# RheHPϨ7%U^g b|I{s–16fOB"g!{3w}.?̒~40=5S:i]8ɢu6dg c^l6:]V(zsgrDV0!FրCߴAO"|Y~"nιYV>k2D+K8 9S#@9T C}2)+KǺ Ӱ0D:)@(]Oj+ӀA' gN"LB`/<W[FI7;?F.C;x% ޯ*q նC{|AqvP:_y +I@<I)PO-W݈nVR=ξ3;lRZ;aUu#)9(cI1ZS01tnHDz qAT+_M2X⇋AŦ)Ra )(STI$%T(CӡMfRTEJABN9S,ҴFW]Nټh`2V6΀-dʈVZ_U&.Xdؠzg`+bkb;"11 &7ךX Cy4D`'  eJ4BEbQR s>"x#V)!68~(FSP fHvYU)J"{jTIZJ+24S)0Ai ^ ҂#+kmVBcSDh9mXJYk_8ӫA#GZEC%DB!^2ˉu^wDi%BnugW} ˧XeV]Zܳ>#UٻQ( >C\k^ "lHG$>&*s!:caw&|?z< 2Gq`Z}4]ڝ# ?IY|k8U9"M@ğsQrq69"gO5`tfhwm*f>-xWt ]?GM=WoC:ÕQӇcwj˲?%,0{py`w6 e|QM3?FcIH}? r o9t? ` Dmi;sw_TMKs{Js"rM |s=?%&mCB+yN6 2kǥm ]Piyw,tK_`֥'` ]PSAt!pMhpN*& Lw[`Ø{r"纐WYOtwf͉Tk2 X5X3#舎ؗWNA|z 0I.V s|zfu5j2m U9xeM-:?![8T7Z x7"\ ft9jLqo:S_[/FWo޴E-\i~҇"xdY: wւq \XQa! 
2&`w߿`L#S`!pz1sfPPiǓe^PvbŊ|g2*f lVC;r)sS&z"ܮaa =C~O>Fecolq,JEP'dA 6ԎT36OF&owк0Ys*CH9Ck6T-2vz!`aayɫɓ 'awޫg_s~jχ7¦+a=~Lb0ps7.oVt-fIٽ kp!'YK^d1 S=-Bz Q EfRChL)TJ?WpKAbPEtbQGIg3 XQ5!!_FT)blwxu΄r);ZcJkuMhn&$TЮA3`|g~2>ɟfE(I+_RC)sAX_RNHsL'*`F?2X"ۏ;l4 o&u#~ӒK[#~lY%kvb=U^/U;i;Љ gS2^n%+ui==`a^kblWr[TYÙp^ݖr>UrpʫV(-Ն~֔j0^5ӄlIxk`_6PUŪcj/ S2F\A V8WC!YRf@ek8Z` 5ub/#yRk>Ѣ/Ъ|هU:m+a|oʥJY Se)$ݳ(WxC|܆bS~21~:l' ~a ܇)p}y ,8keqST"¦V*Ma+E'^H2GZ:w;]m}[׻>讷HbmE0 -*SEUiVmhjw&8U9w2G,4ͷj'·"|>=Ϟ ֧-F}e1 [So z~Ca<}-$wJ*nqu=wxc 3{OOnb!Dn;k#"GH s'^f# aΤM9AZ(%_'Jؾ-pHS~*"u=GC[]g5䌜Qƺ㬱HuV R5clQŝtA+h廏\Qϰ&틊;n2j_TRYPGtVrRY c^$knQ*4Ӧ{rkB7 Un!N'©mID㭢:u?[U9Ql P[-> Wx!;5 {ǩY8dTNnsxm8"zv3lJRZ`05eLJ2PCR QSU`XŽK>܀PD/`Q^Iָ,yɴaq-i8L֫+S`4Y:Nu’ !#`AR#R"+5jï X暰?yƫ $%Wߒo?)]PB|G Oy|zuC I9/}=l!!0|!!t3LU3^`|4h}| bv_B>r)\v27}4ㅅcYӛv_ڔr}yHN%ju~|6Y@Sg@'Z#K4ݯBW (@sIXڣ/Oj}UoOnA}dd"z! *rzY. !=Z-R *Xb:CͧuW23v_ŀ,JU+w<P {5$ X c%b&&[9;\/QlBT#Ẍ́hu1D8-3ʒ? 9p̘ƨ-5*y%~E Zp} X+F M榑=x`-"K9w 0K('&aIzKRPeZ9dtfH9LCp20g5eL!ljә  ~:+3ESC{HS̔"$/"ɏH+A@—i;^jfps^S슰X{^˓t١h֮!KM'2D.!RYoM2 q{X/oGQC2' gK sfQîK4)VePp6uuG䓖\ȹ݃,2 Ȕh2pQR0RV['MŎH!So4u5/x&Gy:(2 J!'+R-xivA,/@ D@=Wm-,V[joՂUǟm=aFwg .o}ꁻs`=цg j &t5sW#=uBh8qC)mFEI#LZQi<O"eC Ӝ Su!!0 ١Ek3YxsxSA*RC뜀Oթ"3*ӓnl(;9jS@Y}s_rƖmS!qfLZߜˬ:gO<1oɿkϟwήO_b'긠$xMO2ɴ.0GWرڠWuS}KZc D%̦TD`,HNo)Yiʍwvh&pNDŽA\ QX>RtwUwRCO So_DoSJeΘJXFLP}@21Cb2bq꟫_T;XˏdնgL/%Ҋ\oNHo4HK@z [@igmش곟\l2Nq-2 sEn1P^+-_-_=/x`Wn3rl"${|tZ= *S$;& &; n-=::6atPYM* #+@సiT)@ލT1UVmjK9_ok@adZ_Ebc.%$:~g fF= F>:%w?S"Ѹ1a i~=B M̑z%nAױ0L 5<l3ܵZ\>'N[3rj7q{O/&n ĽSC9x7Gt\bwh99ˁ19!J$W0!EIL;SrhM t(87%$sӼ4LWofoWg&HB: {@-=^uPFNy>-'E6WZC+k0kz5e4AZvͲ=_MPjS5!XJ(UUF3UXÌEaNEI<$Y~uqgM>ǟM@|r3ڛ@xZo,*f;|yAon+tOGpڴJ })>do3WoϴF!0}ϥnf\^ӿYtRf~NXsRP|!xJ%YO ytRtA t;4FnCҭ|,?O +Jhc4þa|j4dBebW.R u֛o+t FvUDr?-Ђ"dݒv4Pj"ݓ: w;A Bpb1ԡ] )qhBKؙmBNPK:֕3oQ:aܱ:Ysr(_m yWbw9/ XZE0e:F3Ġ,WѴ<}}v'7 O<ŭ|1j ѭV#2,3w;6ϬߐɌ=g /n}|qaRe~g{y}o@㮱Ӥ ,Pcy* pr"c̪(uD$%oLr,RrW²acWB\/\R ϮxO/ BaEOV.ȑjuNRqщ4z2oCG9&U);*ہ4ǡ庣!Hu%GEw+V`r´7־%0;섃!R)aiV!2)ʴtϬkE[f=~~ #`7;߹\gްxz6q~Gf=}j>o>h=0{ŷ?E Flmwɵ&ۋI.ŏ77׷.SLy3΍}~8-?=yWDٗoO>]}CKiVb]O8nz)! aEݺ(泌Ww)jEwٙfϻ"62kdv}}B>֊%;IxH1sZU3jyֻ8Ai1aWқ*݈n [6k]m&S։d?M i00x`f1hy=&. X9#=n*&DC7:Ɩ6^*ݔB9;\u i"lD DAIRYjVJJ&mP~Т 2f,>'N iߤG,t)GtA<U+4م|4V!?RH2ReY0%cl'Le+(mޡx}qDf ivKKiU g˰Dp^;^+YDdb=Ms}rDc~Х WfE 3$_Mlvܐ%j=k@%hKS#2)5%隓"L/<LN4AP.J M`$i㬉)ChB*%me |y 3_gt; L!t(VݙA]+u(%A!շj2)>W ;/})1^5% _'ф-3Gz 7ɻKCtꇆi0(%bĐ|)Pq1i\K:yiTЀwhFNH,"b݃ņH*<ϮJv~f{4}/`(VnYչz閼aK/]G)%J.dw*ܻ޳3x;ƾ g^$FKlWq*lI%_k r)8?Bi?/ynj=v`!_"W}m6tC>ԢPYE?_n4K݀W!P1f|~K~GiTW؅Bi2Vsb+|hՁf |ֿ-x[wH,Yژk$<(M5=8Zj<.L껆JERvWS%ZdDgܠVI)GqLkO$E1Kj[m X*C;<އ ́Os\/KDa%:{u/pN>_p_b`-7wPLG:Ld8Dj}$ee;MR -{Y3-QGaָXvm#bDQ.!QZ `n]˪jNK\k@@>DDN Bejl[\sƬn;f鵟6XxYNҢ bXʽ>~txghhB,A^DuKrIAƨxHL'M>F@Eogݴ\Ⓥɢ54fEcC!1yD6q^L$ŕ'"qS9%=#`mՓfMb%YkMV*yi%5HHF&Tngkl^hdD 9ZRs AF)2Zf48@"Ēfu=CÞ+QEƗU]m!Ыc R',TS$H<*PV2.5kc}n۰>EzC؋wPCpkJ Mt0m'HҠ$:iHZ;AP N%`zΒv5AXß#IYie68PĄ',LE_S@c8͂ȶ Mr:@Kp$e Nl$}tӼ eIM21ڲ{F HU-+TFke0b =4@i 0žs*QT;!%G.ߒBc@'q3R JcZ)%*VI]1&_]0dCPz"CɟetAono?\]৴f0~!JsO|8R(/ߞ|/|Wg_ d%qY#U21/,GLB!OU+2 j'ìGh09#TQB-̴NCr C@I..堂.|&r9XiYg>{BwKDkzJ@ xP&) j# q,ōCxw㆗ 9,VKPnv_"cDmWIH‰DQ( WJu$qK5Psi}"n١owh@2ߜ{T3tQJE'ˬا+I)Ohsu _+qNAsUcǫbr"/$mn?#\Z\ϴ]#API$'&-Nzqb'Ьzї|JhI07:QhH"˶DNMU4y];ݛ?(cPF/SDD<(lF΂#>8P_yKl]WY)}20/{Wܸ_(e!5\]]vlvlXR$y&s[נd( $Hv<55c0xFt7f}MnWyvY:"QFrec@̄_Zy&JUʬYbxKTjOlxVY FI_ٜJrKDfl\F[xf ̇0}FLXx!c 0{䀎uÈAG ?C$s;bOS VB3SPJsLW>Eꢃ:cƈ[źjG(_tg g(bRV\ U/9A{W ^ PI9e8$۰;_""vd!mhUGk94ܔsޔsj|ÁmVEa@m(ldkSx0)(ldjS0)Ԉ)_.mpuMjm8Ũ`?F ]}xVqF1?F[>gV?RQFh땋z^Y8<6:D%{HvF}iS[?Bxk[HtCM\rF_MS9TiѽxM'Z"JՋ=h3R -^)pv:H{9X;5cdd(1 (mj A|$ }>$PnَA dء{T\XTmء∿BO+c'ѽ Ne1,C$}_Rax'(_yo$!}W5^s"o ߱u1@K_d. 
B}lP$ePCE);n>׭{'Zk0 _d+2ɑw&5K#KANowsW<%3Y.'b9gW)‚IQ͢0qDyuUU&huNWijFY;tHAMj'ߝ$sfD+V pʕɒiX3P#&UBQ@9GϦSgW|1'SFHnC)%)5pH͙RAQ)j]LD&jPL[l ƶ pxxQݍf?{L2_\^TJo#/ۉVQ|/2*w.7hWloI+V0qd^O?|Znw +"T4wIo/[|Z?,hyq&,`xnYjaiJ!()?Vc/Dӑw&|9H01g5I Ե9§ilc_;`$NUŪ z6Q`:#mԢJQlLśJAgf%k!,Q\x\bVoH!1['mAe-S6aA/1CmkDqȌÇ\սQhZ^RH~o f,;@}~*vԠrB+R߃Qj"7K2aAZ {ł >?}\tOz W_MTꊇgaz^o^1՘vh- V!xOnWhw7_}M" s @O@[zբ۵wpǶŦXG:nS#bW?W:갥0%cɂekۢw>[ cwpuY9ufzcQRilvw?V`ǹ:h?k"̟ Lj>$!G h;2M!CH֫BU*T-p0=Nawwiy| stNAP{Y*(v $$9lۜYپg/E2jqYs@1 #:CLpS"EFs& iLLFj 1#%YNAA-4+Ub˧<ۚ^!&=v1t~iHS~\,j2*cndZíul6TTPčn+ƨXf}nc1tT|dOO;ss0H_S |"t4GXͥf;5* 6Ez_&F-U3pYu`h)ݧYEDɳS0_~9.n# įBSQ$b- S9R:tHCBfS,pRW3nyՖp= 5kp \J& MI#jIIXg ՐcEP,,1լ&3+L p3+]W$ӆ2"L3XB^uKFXٕKi48BsCE&lфUf5UV*P<2k@@8\}%<#<XMx&cR%& *z=trͪ@MH.B8bY'&ҔR{]z*TnQ+|}ZWy5?貓+\:FTz۸V8vP}R9 ཪ/(!m8fU3_/ j(a_jAaJY5/6h3.E< A ȶeY1㒕 ~ɝݾNiLӝїvdFƺfR_)+Ym2 C%)<';A#`Nf./tؤg*JgDR ;OSQ87ƛ*@*26(M@FHe1+xIC8"jȍDZZN0i6ZR,eoEoJh!:y-[X˜u ה51xߺC)8xZ`TiZ=п>Z٧! rmYnkDbqݫܮ=^dnt5lF[Z~?1Ż}? ׻XK;?ZQ!)Klҷ?8f;%[\ amv"nrs#36g 6ݬeaNwO|4[o o|]Is..?t^n^}{TrΗciaF·z\^l. 5d-w a~כ^y>MQvNó: >~o^/H!De&w0S9AD$'e~ 0j+?<IUxۣ?x,Wn҅}pxSi96px^)x0ӒKoH%>x+ 52V0j}:F2Q.~GŏQqBaZG>,=Z|wu >_)CM~[24ōoڿN=5#L[MXiks#͖ N@'B[v[_~ߦ[[܀KXv;W^~7;<p~Ӆ|[u61G- a)]JWd',z.I+#e= @PɰK(Q LH "0|r*5Q\Fǹ~݋%Z U"">*\m,_ga䒣L#>㈜$_ aS8J3h_e 0!ALgm(_mQd3Lw o::;6`yEuD1W8.F$,욫κ/ɓ(QM&slҦrc2a%BqrTNĎ\ؤJih.%=8 VuF؜ZKg2qX* [FF BüT 1t `CSb L$<;˰5V qbAKY ~ BP@fo!ν8UcD @8 P2=܀cG7 <`b7HqL8m@-%"龵  z:Ujr3C Q=#LMf32`fFڅ8=ЁD?-v%<c ALMJ 7J!pt| nyj) GeʥBV&#B m]W$AlZ(>~~_=9]2.3-w+gZK%u i.fL1N|J &g-uTܹE:fG^gizDeGNJ>P6\cG䥄"4jӳ?__[27V%'z#t_o/|]Xwד"7 zƑ.'\J#hWMKn|\EM,m!/OٲK%ʋ/-I^,29{x'R1X7pVTg,%/<Uh=Pe_%Z}Pe,w7|ߩ>di=kų%_$ 9s )AGC\b[(>&mS!{nC-ٔ{742v kb |Fmc@.5qmn澱[ 9s )?nݱлbc:h݆<Ԍ޼[8wa!gnmJHPKTnt|x:ݚѸCW^w֬#/އQ7+/'^E:r 6#vٟ][tuW|Ոv0{ȪAs]c=f-Ȩ^lU*FhdT/YyDrZ ei VGwgPyt=߽M Ql<@[-)ħx(V{qh jk+wtjTI ڻ"y<<=~m‰а1yx5}9yzswY=a^ZoVn4enynS,6N'eFFt h90e\(I"!UyZhS0!)Wh´!jς'Z|IU$J ,'e.YM4(4-rLZ D%Қ6-1=)`aXV"ahN%˅iF2Q¾WQ0 * -Z-9gP '"EE%tQȂJbeTɈ DqV5 صԼz\W )Z=AG,9' omexcog?R}Om*jͱz,wt5K~o\ѹ$!B~~ _?JL/[!n?mj Й?7y5? fl=t% 0-%hw2ʩٻK`LJCq<0e_~0 t~ICFL,o榌c$&9*JX~ GJv@3B!w@rk>уFPӁ*T℠lpxs kOp239B `h4/Lθ2f2/"[;|p%J\J0|Y@SiA 1Cϙho?|Np~wGub)gy:[mf0K(ad<4seu/R}֍'jH#1;ꪲ.ޯUcR!v1' ~uSWʪ;]`pic[K _<}}M8$vroSP!'___|q#eu*&T}^ҪuZe܋-#zsj=YuQ`ߺ>nfq:E[?XhjVۧϭ!D 6&Gײ{s/zRGe`a?F(Qvpt5 N =T%%FIZgȴ*BFci"idj<́grRcn*9RjNNG VhXr.ؘhZ@jzӼG5aoEb4+d.BwN^q ^O ND)E/{GNL4B~@8óȍ 7vr-V pRIViɉ)ޢ,dR`~B2LyAJ>UaD 0V6`:L=ߙYsȅ#JmO8FlOSYiLjSLJ v#{(U3h{[#Tw%# HqSJUm< UzŇ҃6>A^\ѻ6B[1_]ew޵X/FF r+0%v,f] iN vWauweZg1I1.H>\wqZoErõ"cĎP0t4Ći0Q"}.L#] ^^Z H/F"t?k#¸VT}Eq hS l0:B`鐟L *攥? p .2|2JýT*ƥE*v4 OOg]PKC:Qho Bs4͡d/' s)_%-b]7m78V0F1: SCyWWJ;HZ_lr>wgR>=|<anfd=I&H{y5!54)[\kh5-F[K-J~zOjp/=|L\6UQke֯6+V4S[z5}z̖ծx'/&W<<S"ß|.k+a7@?cnU^:R;cT}ƆyCy4Vڃڞ]ƦwuZ5Νr363z:7\vd!gn!6vw3Ɖ3nN3x %nL-wa!gn!6VzOj㌄[(>&mc'?9i]{Ǟ-ٔ1wZm@>wpu$ u=[ 9smmSRh?>f%Di>393e,J2#)l Hu(/|nV,ڌ Z"nն[YaʭѢO n^װEpݚMyW=Ωe~.(hu5;:ab?[y{!_C8NRSoԮ$>N7eBwex|jnHGP}нDڇQW3| #Bly?O-9"21\Е Xlo@ 6|k2O)%)9ec ؍/[[KqwbyDK+ST3Jefu <\[ԍkQ7֢n\r{AjS(1 Ҕi)$%SV0gH!@A&Rz }PP/Ї6[L/q>ć+=Fh)h [hpJ?L'WSKpoXLWIWӖuH+sle _ol }>xtZ4\OWWO$݆UjO߽Z55\|\nA} ]H! g_3wSaOG? !o%oVyX=^o ,}4WH31T\}}Wknnp,7At2eLEZHCyxzXZ‰ Ԑ1yx5}9yzswY=Fެ9bK'I`s9c)9G7p )We2e@R/.2?h9\8)$K)+JLLBZp)BS/ y^ʂ #DRCKz%N -|Q$IR< Qb %aI"(+@ve2⢑39/V_5X3Tv5Օ x;d']] 9eSt3 θ%ITL1;Q(m^gnv4`QU⋙yif\ua %xء kH12MO$Np1c:I')(g,v8LglI#{q #v{1$H+ӌb^kQbspili?YIpStDɞ*@.˟&zFQy^k |g}Mcuؓ6xjN R3M~>3#wm͍r*+v2T9$/R0ķX$mPM٤ D٪\|׍FwmWp܆vJXةC'V3qY_:9]`LBnjل/kUPX l*l&i0fS˜Bl0Z-]Lyav 8H}ktkѢD$Cbc>>I&dlh[iw@.B! 
oTBOsۯkL3CFL@&=W]IicWc͘UTjy,F |f[34WOd.% ] *堆wGc?LWFA_/5Fs:5 6jVU(S +Әo3- +'!wY5plƿEݲSV_q&4̱#\m{3Z!FwDhU1{mѠ`U^w臠\M'8Od< GO1 kx-@apΰ'Xk}uF d8Bk )rR}I>2Q-H_G4sWDZj5&[b`XO|~ש6&9Xpa;]$Ymf/OҕKE$#B, ̡^$FwrG=I}ՙ@bi֒ !׹xD+~~AYD.?uBHbC;p "qGɴ.ط۵8d\Ar6*Wa(%Fs{P QqؠrySXjM>V䟂bc P>-i/ac{)srMk6Zu{ӕRyF^5s3γM@gv0kPHxG>@N@c(͇4/Pzh?=v]g;^Δq_2!͛uSQ[)XւZ#[.nN`Flǽ҆ '7o>Mdh3l}wEr;noߢ(⨔PJC'DnHαP+&akwԆ5vr17,ۿ_߽=8Jkp\zz:Z(сEõYF&( & X̃WV!+@8D:O:|RN_/7:s|Y^!0mW .~w&x'?i E_Mn,bn,ͪi)`TAXUչ f=:`@ȚH)H0ɿ|(ehD#;baڤDGeb<5J7lgQ9Y|I?Y|Γ?y=ڍ5xLH˘Heef _V7i˳PI_0Y=6C\wcB)u8A㉞GKɜLAkLb,m/R`.qם&~2'dJо{{tnLP)2C\kmG[ݾ-)C pni B (aQ{ZzM[&(!fD'23|˻,.#i[4W\mNPf ToongѻEvYջmN>4A VIp 1:.;eѪPJ0޴m]'ul,Vf{s}9w΋/IJfXc_fEZ==qPM[MJO?Xd)f;*!~<9#i.RSFJRdo,EǯWJ>=%|Z;alfe~b|$ D~zrv7@jIF?nAop9KQ(=T\5$%Vt]!.֐P:o1|_iA^BLYBI&QZ4$wV eDٕY*&2,S9=ᝒL/ˤ܎ jގ(\8WXY(1,GZ>,m31LN὾pgxϲhI)GC9ISt,he^|' jzqMgP/|cɺ@,]oij g+ .f"FU[,,hDAJp1wXX򟥂BAs^ƛ+ lBԁy#I8 QmX'ԝAsUf!+;AV9i!DKf.kTw1]!9E36+ a]=9o%NҭŵkWfN tgІf%y>Xȸ@\ZG22%BKiHh@Ƌ`NN|w(@ W ilLeʑ!2]dF[4pHǐʎLnXv)T.V 75``m.&tX.C ؼ(@l At] ZCE54ՌLi53N2IC*bmӇd"d&R&g 8YaR`j3w jiӫD(n}Z#o0,0l7 JA(hD*';.h1FIXNOACz:A36atAd l-h7 ʅB|!d#ɅCvPr"Si6~2)=I0nü ey/i-N30"-O G5^nAel'̷p%;`ɮj4֯~ZRB+-RVcЪ\q⳿=TӼR,ֶr X[jRd=դX90U!z0Bl 3l0lZ3Ha1C'ŗ;JCx9x5ږ)O;gWo/~?/3녿ZNKv9'-WQ;O >[|_me{wjX6A(NxT/~$( I(b:e+;a|<7S%> >uĀUٝ[sg ~>8Ko:ǒBGqpw'xllPLcR6sѧZjҧz{ ׍7T%/tK e' y"I@o{V,-jN9h-Sk7ݺ[ELi@J'>X zTf~O = i#zz㱨z2 NݢkpҏCsL>qU'a˗[e7˭i0[6X6Hvoxt0q&2ELgz=dY+Ԃ4=kMtxsl<= KUXY\^*gNՇEv Y+{.@J-*~vϖp_@nM6|W#poww+ ;#ǀwk~Av8Ov;S4N  u}.AZc dˇj{7Ԉ`=ۡoO/A;ξrUb=#'Aו{]Bҩeh7k+.]9_Ђ>2[2{^xSۦחOՃ2[|:>x9zszYM&ԓ8YGQ_X)CF:1,",dB9U3N~i^J<Naqp!05aV.v 0s#*hi|bAZE(e<yf~UXI0v  TeUArD`%J&Lsr y̝t5,ZF֋64s<ТqQ],zAUcQo%HXBXfP;'K|hF bqI׳Z:7?nGwz~[H&<;~H[&4'JY]!O>;]|XYj`kDB!&XFA}ʍ@T~D/䮂C60XͣyquraKQ^!oOU\=j))K(6jD! ][jc,pZ srr!i,6ʶ1u|4}/pSݓZs/y˧7z2z3nQ37Ǻ~[r ⸸C&y7@q+J.]YsǑ+xz>Io8Z1HIOv0$a 43 %9߷g)YӝeUVfVڼEn=)-T[^)ݭ.wIȝobu`aj6felbm׺dիL˝5 :,&gʸIx&[?lS)9VAVdԋ 1iNJFi# G(6qX#4yVC{ Ac9:!+/0ki()=q!wDSȆ]9J`'ʨ&)@Z3t]rjryzlΗѕL | O(9)WWޣZP&[݇eŝٱ] b-0vjwl"HDt;8 94$꼽Z99syݜg$ݼKŇ0^~5u/s Mep7>f nF6bZFᩣ^Msfsr@[϶ inniJٺ~ը<{ r2z#S$Sᄚ `v`.8T߹bYnwۻqOm!8Gsደj}*%+ yS \_)z˪`kDX[Z$ zb~9MY:ɺ9 sEF[ǣ80e]0Fݺ,wW >+x5ݾGJ J &!~⼝pc4rIڕHr GƼGGg04;u|+l +a(/zk)ׂPzuv B-EtLtTzp{vK81W_]߿ |vHl`Bolf'~FYK粵*z% >z2~JQ  P; ?y18햓jLm08)-$S-5-^Tc"Pzbwe(J1*xFe˕XC)b8 qt1ru$me6vKAD 1 iD5Qvc{$28!Z09&"pmؗe"sӒC=YK $,ǤT!Ј Ah!!cJE(Қ֎|`t3ٻhjt@DWܐ.Y3t)%҅JNmfVΙf&H/̄犨 5D >LǠc}%vۍsqw`! Kum nomuqgͺy_mCACl5VHb'6swd8 PɌED' E@q @R%Kb  8+3wo3SE亪X-WVPE(B>yM,3<&q3s!}sTlU*nbv$L}MQp N1Aɦz5;z ykC2!`_.jաL2[R)$G}ud$ʎpnel-{PW"Yͣp+'}vpJ;yn1QǛkwp;ؒxjK%!iPi0mdPq_"$uqGw"7+.o.e Þ-M'Dd(4EYYTogeYul@O# `ZFY8:eP uZ+-w \(eRpØsɜ/Ppayub۫+bZS9qjݪŧg_R_^}]Y/V.<JL>ϙѱŠ( D9.k 8a6"(:RjxO*v %Jpnt3Mb&x5RJpp{}`>Աf HfRF4ca\K!ϩ{Py@+ݾB^ iAio>~|M4 =3Ȝ;,byM|*ƽ^ݚD3A_QVtIDwg8}oK5 rꭑ, :i *a. lAx AY5NA@öpxi'u!Mk|Z=9-:Ny8`H'B+0t|ϙS9^xP*HRR*<&X+GFYHxL΀@ 6mL%? Hwv[EDmbmV?&=,&ug|f08y:r&˛ŋI)<'~ :7NycyEzwEg6m:{)\R$>gV ӺϬ(A'o㿗 (_sʆG A$oQ;S!}@@vTG<Tdt,4K2y\R Db1*xj*`Ђ1K`j%J!˃Jq%b[eg |j%B$1໊]A8jqqeGr!Ϲ1zaxإ7("Z=\*OD1KdrR K$)dKl4e>B. G38@%D0JzLI?FSjUj {? 
hآvx\D*&JHT{54Zrheλ7t BFPY=a ǘ‚V}ҿ -0\ZId5cPcbRPxVL0K#Ӕ!Ve&Y[1}ky)w[-gr*Ey-V.,[!O|)2 tcKl@VA)}G#Of"#&3gJ6fsJӡ@G9(v,,8-[uM$gGݎLd&mMTk"uQ+p朧W:q%O+u}ȗ,b_#0Q3A6(nʁ)LxS,UWM7 9ݏy;' 3pD_狡,ny01L}gHc[]oPwT#Ϻb@3Y2+ߪb3Io\ D#'&*x:;}K]l|H-=d4=iA8,O%cC*j I2+h0V75ճW!'#oQLi<`I@EB}LR}GXK# ᐐ}[S1rZx j1[R 4C(#<=iFA\+~_'U" SHm+!.$?ZmLJr \M‡, 8A6Mw^ܫ+bS9JՍXf̭Z|zyvE..qeպX` # > 僗Pzzk(X0PKA:%lVSQӆ*5ne]ێHr_6,U/ l0;@[RꙞ#%DI*$)ʥ~QdDdd:1Q^Q&fZq bbθ R@j\;T`ѹHĊ+*3 +KiaUH yȉ\vxh<3pQJ\iL::V,Ji!$Rj{uuF4ǪTLDqY-_7(TcJ \ \8tUl]E kVF"EAHUW9HMjDz DqP0&5]3^5a|u&7ÐdHCZ1TO԰Ilb6O|~o5g jYGZ'LكR a~9YWW .˽H5nj:47dJ ?jsNy >V_}^ =,'hzw#VS3$0A~$;3 0~fQM囁9قt3D2VX}`[67^PEХ %2A$DR:+_X弶/)-j|c(:1Ccw/ggo?W9ߣ?ޏ>5(AG[RȈGӖa.l¸!s셅?K vs*r>Hq3gs!·FN֬8Q(I}xJ|E .aaLbK82mVkoHRs䘨%KԇGɇ(A`Q&݆xYxiXxiq2 Q~(H}x(N@r(xف0 ;a4@w  U: o:~~-狰ZN:!(v\8-kֽ'[zÏ-ߏw0xC'<.Os s"s] $RxkA AoEBfCj[Kd1 .a3UëU~qI-wjS*q(Vs}wҳy5*-pF%=ߒkPpH":"]śbP̝͊ޢ?z"PxZK/΀6PN1#0u~aҿ'%eRݸUs1w խZ"T~>i-.MC{ 4iˌrDpVa FsUSe58"䑱ǐSt0GID@r) Ax`T=R:frJ5ɕU3BA-HvRT!=x39T5%u7C2c+. ZJ|9بZMWT7:Bsp3m!{I2c+m87бc&^sXFF1.u1J^n{W(T<51o]=`i'^3P;YvNWN$w"C?P$:tJq'>VƷ֙WnUk?uF^unCՕ#yą2ryTtVEzC).yhօmBf^h/"J bGrQ,ɈFY).H#}Pa-:vcM ˬHQ'BA0$#UY]Q^(Q f^ s*e@ 9p(h9)0tbg=u {7?*ȇ Iׂy ƽRp0fLLiaGVSZ&u8N$(=pnϨ$(>_PBvBo8 Cnٺ:Y>AWFYI| τX[;΍Gc愒p@N٧5A3x+,3fDJIF3 !i1~壟zGm^{}GQ ̌kqY:g4yF![#H )Gh dpk3 oku5Xo h8CTAzĖz!(jvnt=E2xȜR/H$Md.!#46gY*D_G)TgY˸W0PGS-t <:m Xd>3>0%s *VYC΅2 fJYIg_=tǙ#4 :.ePYrFnO8዇!* \DZ K 6`R)L D(qAz1&v<ƭL ʤB5|$ TWȥ HvP#pz: zm%9nHJJ*yUR>n' 3/%OPWSAKi0i05GN;yDwd4H+۱ADAjO ZR EPfFy3tcRYv]~"ViJTgH/bM:CƔw U6EZ;~h 2cF-b/o?C\ ]7ҿut4ڴ;z0[pjwz!,tqdVB#9@l9vlڻUjv*vMl;.R?^_-+*wo}oI.Ky^O.B/tw F8IΠɾ6ĐkzԹ|GXnFKzu< f(tkwreĄ[$CZi6$ OU.݋VHD:YeʣKt ɸ?)"D:]@ tP\u_pzi= }Ň½v\i-ͪN`F s9AH~β$)n3ĄEb&X ^KUgJ;I %Hh{nřH0υͩ*Crb i:PI,cި#?W`-ͭ(f1 8:Pyű%6}8Fd".~ZeEYΩ&`KHa=n6B(9 9'{}z|X0~q?CiJUXi _|+zrV⍽3/<7-y6zpo]g)λ*$knԴ0] e;k=F hQΉ0Q?=ބ͔Ϳ}œs/#?yX{wwK)RKR6j쥼kd_4")dөԳͭD9I.1L=ͳ1qn̜2(@/6.ݘ; Ђ #Fa_r\h+MP?s`S{ z q8.҂-Yg-..}q6UyOo}&(+1|GrjirOFSM^\|YD$wK;9ݥ[~>ͪS(.ո#to%GPsovr`UK;zJ \XOh[e~&o3 ZiBTyV Am} uXocBwj|٢Nѓ pVN'}2hx>Q`Y)[}lqw?ﶏ [mRZ o㑦 FYZaji(QXh<8͵).Xfɭ&hI":a$Lf9xg9Nţ%d.- h4$Rz"擅w)Ccl%xxWCD%2~" - @1@Z02Chtrcɝl5.>l{eG3<ƣi7Wr3ڢFHMݲMkզ|XjFOuC#UÅ\K>`CMZh.@d) ])ah&!jSE䫋{Emk3Naf^VKUrí2nʄ,Iit:GL=@ cVXY0]RR\va2dIRdI8P"P!C(4 7+g9Vbq*hr{}&r&ER~%7jgkK(lǫ*j`l?xpl@& (,V`+-vLw0 "JnɄoAY竿hig3PBg /ر:4@]E36znnntprp}ljCSٌ͆ikPEX^;[l<^bѤwلR|ܪ #nKC4)8|Gqdq[*1:풝$A[ydvk#Bx6)A$1SPj|,:NJR*md ] '^Fyi3Xq F.TEag2eۅt(x j+0~ugx kIpms~?3edyi1R(Rػ6r$؛Àq8%Lv ĻH E+dhnb2-_Eby0!%cTqC摡[?&1Bvl eܲP݁-75*%ϲ~1bkȦQh?:H5F@nC]!/RmN㽃cLm+ebϱB>,wqbzCEgtZ@XkLLh^[_l%⺫Ė/D2>3 SaY9Ct#dRce)ivep_,51"qi^ۀ4V]HB9JjD}(JQxKXѻQf,CD6.Ш)z>eW}\W3e*U KWeYZwcNT86s*j0AmFύ(Hr)| пX }_oWmie:h BiΒo, NPI0̊I9!*e)*X" יC9J;ű$wRhsB(yJZJDIϕ|H)W:-v(f9p22if|UǿlI 8(`Ԫ[cu;2+zz=W^߂an "blGѧ__;͗x O3[L>s藇'd' ˬlart N%NWİs{ggdcvhFſgKQ9[F \=Lxs3m8s-ţTLBp):f}N8qb ʫPa 0JKC<2V" c# ed5R;de\HBִBLJTbτ5cG`qfJPp % GpM92I0L F]a}HYM ˮ#dZ2D8|2ZݫR6zԅǷ020d9cR@XBZ(蠝#YSu}<̇eeyxYƗe$/8 ء)3\ 8+`G |/Ta$4k(R V*yqRJOg٭}ÿcn -v.nP #ݠR˭T깘lU"Rb,-Jl Km43<5Ώ`=O nuz͹Kɘ |rb()u#ȣwqQl1Jĥ)={L)Z[J2 Q! Ż}(^HX)(6%O{M)Sa^ E?kV; C ɓA'%=Xi )F#]Ԡ35%M,wϲ%tZ΃R8ܞT<Ԉ x.85C9V b]ฯSIX%.{O;ذdX&2Æ#/@Ј9WyA1g%3i4֦hMP&^7ԫjX:Sk/pḟ!圫#(ZmZsA4/AG%Pst$d X[WLI&9X*[E{*}L|y<҂velNdCZɸ[7#S RPˊ:-8ٗR $Q/6To/@:taõD2@lWI+jHK:wil+/D_ބY˛U7!?uxш"JaޟOseijh;іO +̹]%N' E#0m=x+܊L'9T=zk^(Gqzj(@Eпe)&:]2ݐw `'|5q-X)O&^^e2'wzl}v)/lj \p]0o~~J=$S݆RoeP/nbRoN:*9#r`a!O ؛&o Xi)QGq{ C}eAa_}|1b_f~yy{?P~g-FPRW8}, %9dJ( ¹l[%ς1pSBBQ&wZbƣ1L!ӱY>V$FgO0-#V!6+Bj(F,XQAG\Z]kE"iRp Vy? DJH57cMr2z9QT3&ޔ˜IMx!\-'`c>hfӅ*{(X nu!cԶ1Y8+Y RXz I ^D:4r٢Rŏ[[~:<޻E] R]>M5 95ęIjd& e?_ }F}kf(? 
܋Q~ q1VW#l^O^{x_Mml J_XqDVPBJ>/8*+h)\yKV[a^:eH{Myd #GRJQJ}"r95;*t$~\pʅօL `c }_:ß a NJiL3E i4 Gd0a=lq20F_EG\9Ġ1ײ3a8g8]PB#D 0kvwѹxPX|to3–QQk\a:TS<6hEpaRJ׭kq%wx; dZurHܪSTr}\"g25Ɏ=slLG}^vdbr8ApL{#49yiسLPUz uI$8>-cƈP9\cYi-͌TXcCu+9B(A6˽NDK6LTKK-ϚS*t\2㼽 nmRƹ҃ϷT VϲL9aMMO?1W[f1ϊ`sñ&L+ӐH`;qu(0-l]{|VJp,ӓxP^A36YѦlWOqdT Zi\ѩ\]Vp6WG Cx)h$%dk ȿ~+sr3k=<&Kq/uI/{ŭTE$}h$n+` Tvo[nHP^W-nmw|]CpG=^x,xil9o\O~:17E]Imߪ9\ZFӇ7?y1!Pe:Oobzw<_\d^DAOpgBgv+a xLg xg,}(MDT@йP\v_nwEiΊQ86p~[URu]"Ӟ!.CɾCcvC9#_!)a/XgE/A.'ngd:XTD.%$a?t?# [FE nҫ/-_\"4n[Ob֜~!28'!ΛjwJM\Eu H%By4Dl$y q?7ڣܣs K#0>)T BOg(qDy b9ڜRm=\Pg\y(I*fH+rgi܁[)hsx=b#6^poOL16,NݔVF4>㯂ש)CU2\szZpnDsoư^MU<ɲ’ zFKo<ř 9, I7L0\y1aF2٢UoW)p\ %֣aRhy% MJ e^ _z}a{h`EAx\E O>;͗ȇ|şBş37f6Tڱ-a2}7fG?aL9IɈ:aN8TbIc]CXI%/Zo&X#U"QɁǨh;&)):__Ղp7W~;l:[Kva 1'cAhg;)0L{]zrY n}np!1t3d@{u(wa!_)=7llqbu'F)'*uqLe]iOfQ$/Of֪v A3}`Ƥ i*.]2B"Ԧr#XȠ@7 }XŃ!o< ''{[ Ux.D;UΉ޲ghRQʞ nikt25rB@xX*ɷ$I +zi)Rtoѧ>6Zwh'z %Y11ѱw]MM;gF9OK<1SN9„f "Bi (r|܃e0.w7U1o8^79 mylhL9A^0y#{nWpJqtc2\Zs.܃'k1n%PԟZ'H~_G%Xs~!K<Lbl~H#ģq OB{J0<c h 1!F$&'RU,ƀM \e1FZTM?fMdMԭ.(e q. 9 OQ*p Cc # y^E#B )zrI!;y1@%+_/=3Tly2_<*~6nj}4k49^}/fj+cNϾK"._PŵD$94 %dDF"N$M \Y s7JBN+0L.YAɨN)B"9Ka,EZ-GBƀ`fxq{b az`Lz$*ȢΑ '"P!Ir:2V4wO2X9K f1 Y,VC`ePJ\K`zAKi&o踨uOio w{qS+sW"DuqKca&#}`{i*deP 4m:3U\~g" zNR$pNo6oq$mKmh#F:Єv09aMdU"k0׉b6S1\6P57S ,OAL.%@ZB!S29sҁ!w׊m'tؕ9I}'սqZ0A; % ) 3P*bb an*@)*2k ɕ"19Z BL1LNMeY;vb< ?_v_ uo&~Z S9\}1ͨɭ<؇3'j.MݳyϢ/)$*H@F ~=fK,f[o/#]qC@#V'?/Z-,׼yNO1o3cyM&d8WԠDCV[!ƒ~'\ =dԠp_:>c0gک/2cYכ;w3 75ȄU<׀,l`4lgo AeC(ꀯt-H)Q_ˇ{w╧[޵Y)?O{CEwFo>-U#ZaHa;ǝh8i6CXm֬_ս+ZA4J% czM*AҐ=n i[cÇ})NEA{NS tbUtоS9-[P-L6_}gJceݞj[7^ wvn-}F)еy6W+Eh*SN9iB>3T(*aOE=_ˢ)zMsTYVwd!_zٔ䣯v:n:h0wt|Ļ Wftj8nun]XWnmJ{QxE V6=R±XZ3]h"&51DJsɂ:Vj@j|Q9~ s5͠P ZNoUXAfzxl ±3-(V.I@\(dlx*ldV :6fj/DYub6im?Pu{i5ӚaK챳LЖ?Eu|8;}{wTI_˽ŋC \3J^{,]AXZхٸTq۸m&?Q&r9}xzTOe?PVѡTh{7]p`1.SZw[+ lR zoVEr^ȣȋ B>Da&55RKb(+?ˏE^&扉Wu>"(J3F4|Y6"+gw=\Q1 !5@q9&<~3ӂ2Z;gF}d%NSaw$]"Ύ 2濛b>\E00`&#D9"+)0ՋdBEquh֕hژ)')"ЊdfGW!nF3;x J C(Q vX ؑw]N@hVCm;=꤈&gc(ٌ1zx^72e0看4'vy?M1$tn>ZS̺^27]AB  "u^"`wc=QU vf30HLSnjS10"rQXvg[v!yT#l)&cr.%rl Y@`|5J)# o/F6q6KV;gr n?JUͪdcW9.k'x+G v>.UzcK8^f1[MyABHD=PlI"MP"mTIXږyEvMDx"dz/]u;O(B hљG`}m0l6%O3UvH6LIDzǨ)v GHZӷߋr&-]btկFPLY1dn6 dx~ \J A3td:kIןBP5X.46|h `H`P\kNk.(8TDH=+bs< WYPCt6ɬ@HǴSH(&z&0!@dA An (K R/@@>PAqp-KX"5(ۇ'~VgupwN\.z?+}+rUsg6/}.n~us'XʔK=,= #a0GXR)LbELs.x,`LǗV [+֡Mmp'J.PsCTCX\pSҜ,$PdRT%?#TFV{gev&|Xn }[K-/Mi3J8`3& 3@2@"*F LJ AT{0AxۡZjֆ fb2F@^%BSSƪ3{8OWX\,S|a&nqL6_}g_fVf>ϬWo ~Åߦ:t[f@\-tX\a-N$m3ެ] ;FzNX,LBcuKhgb J)M5jerY_7򕛨M k7yuwt|Ļ `Y@][h{"Zw`DZ1Vfg'T_oٷ񛞏 7O걘P߼_>%^.Oze>UB,áTjk^ VKDv mdRzTab# 7 jFR N^nK[ŶD`(Ja_J{G^(if*_feƨ{9I#ndgjms%*pJs@yƻv j jq Kf;K]$ff{iхo*R34=W{PXfn3oҗ">a.g&U5gV+y >6˟ zϺ|u6mۢ.n`YP<29 Ԙ@ +m@$E{5q-E6 f!VrXiXKT}`U!Q#jCf={Wͺv`n-1/]B8)&7*hUb#RZ#xCNmБyK`9!÷r^Kav!U2S5U,qك`LZ;2OedgW](ʷ~:nOCI fFؐ.e}=991`Qpr3- Rؚl=C(tH?XI4y%q5Q[zœT5_f._y6nqKFq Q3n5๴\3%7l*UㆂzG4FQoXJfE#8]7hz Gw@ 9:gLg݅;CEpx÷u[98uNU޽Co""\h{IgbR2L Ls&/ :`E&8!KC4 ę@R *^'ڌ1`( 84-- 1ABl|'1pyơ{mš[ -9r3ʔ9#KF h+EAhjt| &U}P?Yu<N1Pi"ch'BƥUbf\K F~-G{fs p%!(T@lu;ٱA t철ۚB-p"zx9 cjC)y.wVI&GyLYHFposë'8ىgd =rwW3lvj0H̰h*| irfS\,WW/pޑ(cϯ xƏ=p}plsلV=xx P1!ߺZB&7D!jSy]uWX#Z/Wt @(%f3eU.%sfZB%7}93 1GL/4R+a2NdJRs+cW!Zne-Zv\_[4:^! 
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[142335433]: [14.765291988s] [14.765291988s] END
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.527848 5036 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.529294 5036 trace.go:236] Trace[315739720]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Jan-2026 16:28:02.251) (total time: 14278ms):
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[315739720]: ---"Objects listed" error: 14278ms (16:28:16.529)
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[315739720]: [14.278084428s] [14.278084428s] END
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.529315 5036 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.530514 5036 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.532890 5036 trace.go:236] Trace[1352311607]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Jan-2026 16:28:01.591) (total time: 14941ms):
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[1352311607]: ---"Objects listed" error: 14941ms (16:28:16.532)
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[1352311607]: [14.941409576s] [14.941409576s] END
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.532912 5036 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.533747 5036 trace.go:236] Trace[1274736550]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Jan-2026 16:28:02.197) (total time: 14336ms):
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[1274736550]: ---"Objects listed" error: 14334ms (16:28:16.531)
Jan 10 16:28:16 crc kubenswrapper[5036]: Trace[1274736550]: [14.336571419s] [14.336571419s] END
Jan 10 16:28:16 crc kubenswrapper[5036]: E0110 16:28:16.533750 5036 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.533786 5036 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.554108 5036 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33330->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.554193 5036 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver:
Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33344->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.554193 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33330->192.168.126.11:17697: read: connection reset by peer"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.554240 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33344->192.168.126.11:17697: read: connection reset by peer"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.574931 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.586924 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.624005 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.627238 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d" exitCode=255
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.627277 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d"}
Jan 10 16:28:16 crc kubenswrapper[5036]: E0110 16:28:16.633943 5036 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.692604 5036 scope.go:117] "RemoveContainer" containerID="a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d"
Jan 10 16:28:16 crc kubenswrapper[5036]: I0110 16:28:16.922785 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.095828 5036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-10 16:23:16 +0000 UTC, rotation deadline is 2026-09-27 06:24:25.658922436 +0000 UTC
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.096710 5036 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6229h56m8.562217259s for next certificate rotation
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.361671 5036 apiserver.go:52] "Watching apiserver"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.368491 5036 reflector.go:368] Caches populated for *v1.Pod from
pkg/kubelet/config/apiserver.go:66
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.368915 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369363 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369395 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369494 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369549 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369586 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.369901 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.369961 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.370087 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.370125 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.371823 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.372044 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.372723 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.373004 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.373048 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.373320 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.373299 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.374479 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.374870 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.407166 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10
T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.442174 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.458780 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.469531 5036 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.473102 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.488696 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.505546 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.522623 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10
T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.534598 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.535898 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.535975 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536006 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536037 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536061 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536085 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536108 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536131 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536154 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536175 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536198 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536223 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536248 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536271 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536296 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536324 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536362 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536395 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536423 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536455 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536481 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536508 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536537 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536564 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536587 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536612 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 
16:28:17.536639 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536720 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536748 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536776 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536303 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.536478 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538098 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537064 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538211 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537122 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537213 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538274 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537243 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538307 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537328 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537406 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537466 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537540 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537635 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538384 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537711 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537763 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537791 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537886 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537882 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.537901 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538034 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538126 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538601 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538665 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538715 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538729 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538745 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538740 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538835 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538860 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538880 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538905 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538911 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.538925 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539014 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539033 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539054 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539094 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539128 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539164 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539197 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539230 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539259 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539288 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539318 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539349 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539379 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539409 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539439 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539469 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539498 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539531 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539561 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539595 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539625 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539655 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539711 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539748 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539778 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539814 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539849 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.539883 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.540227 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541227 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541473 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541530 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541554 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541587 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541623 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541656 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541711 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541744 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541777 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541787 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541800 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541810 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541822 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541863 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541627 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541895 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541924 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.541952 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542026 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542057 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542088 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542118 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542154 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542196 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542265 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542295 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542321 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542349 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542379 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542414 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542445 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542477 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542514 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542542 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542572 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542602 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542630 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542657 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542722 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542756 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542784 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542827 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542851 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542880 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542905 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542932 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542962 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542987 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543011 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543038 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 
16:28:17.543070 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543096 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543148 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543171 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543197 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543220 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543243 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543269 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543298 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543325 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 
16:28:17.543354 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543381 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543408 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544019 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544092 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544122 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544153 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544180 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544209 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544236 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544268 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544297 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544328 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544361 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544391 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544422 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544451 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544479 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544507 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544535 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544560 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544589 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544615 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544755 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544792 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544823 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544852 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544880 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544907 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544987 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545022 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545055 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545088 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545117 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545148 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545173 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545245 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545272 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545298 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545329 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545355 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545381 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545412 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545444 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545478 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545510 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545545 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545577 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545609 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545645 5036 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545691 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545726 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545757 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545788 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545865 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542020 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542128 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542177 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542190 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542372 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542444 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542825 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.542748 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543031 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543049 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543349 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543463 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543557 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.543995 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544145 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544215 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544171 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544309 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544360 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544415 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544635 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.544831 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545012 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545186 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545585 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545909 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546072 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546170 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.545966 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546638 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546674 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546722 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546752 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546781 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546811 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546837 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.546866 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.548459 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.548744 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.548809 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.548940 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549016 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549067 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549084 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549138 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549240 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549264 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549286 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549309 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549329 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549350 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549372 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549431 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549463 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549486 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549517 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549538 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549565 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549585 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549608 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549633 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549657 
5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549736 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549763 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549787 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549822 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549908 5036 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549923 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549935 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549947 5036 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549958 5036 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549969 5036 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549980 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549990 5036 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550000 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550012 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557322 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557351 5036 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557369 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557381 5036 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557405 5036 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557426 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557442 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557457 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557471 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557487 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.549270 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550602 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550665 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550782 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550825 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.550625 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.551224 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.551247 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.551494 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.551568 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.552813 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.555942 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.556301 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.556736 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.556901 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557227 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.557822 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.558131 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.558549 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.558568 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.558818 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.558902 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559015 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559077 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559503 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559728 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559804 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.559812 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560003 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560029 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560202 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560337 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560447 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560465 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560749 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.560822 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561034 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561066 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561313 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561335 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561561 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561579 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561738 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.561915 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562227 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562199 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562323 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562346 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562353 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562365 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562462 5036 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562604 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562692 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562726 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.562930 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.563058 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.563107 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.563343 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.563610 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.563845 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564295 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564530 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564551 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564561 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564605 5036 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.564833 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.571788 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.576785 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.577425 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.577552 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.577898 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.578144 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.578454 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.578668 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.579560 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.581182 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.581338 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.581824 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.584108 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.584351 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.584712 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.584919 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.585457 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.585514 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.585918 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.585927 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.586002 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.586196 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.586484 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.586557 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.586767 5036 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.586850 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.586901 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:18.086871208 +0000 UTC m=+19.957106702 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.587010 5036 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.587055 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:18.087046342 +0000 UTC m=+19.957281846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.587087 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.587572 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.587939 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.588327 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.588366 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.588782 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.588780 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.588860 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.593024 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:18.092998563 +0000 UTC m=+19.963234057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593101 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593166 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593180 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593193 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593227 5036 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc 
kubenswrapper[5036]: I0110 16:28:17.593238 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593249 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593259 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593271 5036 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593301 5036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593313 5036 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593324 5036 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593335 5036 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593336 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593345 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593424 5036 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593438 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593451 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593464 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593475 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593487 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593499 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593513 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593559 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593573 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593586 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593600 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593612 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593623 5036 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593635 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593645 5036 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593656 5036 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593667 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593768 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593847 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593872 5036 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593892 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593947 5036 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.593967 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594040 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594065 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594115 5036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594136 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594155 5036 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594206 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594225 5036 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.594245 5036 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.595079 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.595196 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.595515 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.595540 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.595830 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.597564 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.601162 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.601191 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.601206 5036 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.601452 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:18.101438852 +0000 UTC m=+19.971674346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.601825 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.601923 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.602726 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.603463 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.604232 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.605425 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.605427 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.608376 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.613870 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.613974 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.613888 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.620197 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.619989 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.620672 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.621286 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.621274 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.622330 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.626976 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.627499 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.628346 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.628370 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.628384 5036 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.628425 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:18.128411172 +0000 UTC m=+19.998646666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.631231 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.631369 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.632011 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.636803 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.638561 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.638917 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639037 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639119 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639274 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639275 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639748 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.639854 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.640085 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.640865 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.642765 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.647021 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade"} Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.647391 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.656588 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:28:17 crc kubenswrapper[5036]: E0110 16:28:17.656825 5036 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.658624 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.668034 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.674445 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.693669 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698782 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698834 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698896 5036 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698910 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698925 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698936 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698947 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698957 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698970 5036 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698983 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.698997 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699007 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699018 5036 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699064 5036 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699076 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699087 5036 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699098 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699109 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699121 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699131 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699143 5036 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: 
I0110 16:28:17.699153 5036 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699166 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699176 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699185 5036 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699195 5036 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699209 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699222 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699235 5036 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699247 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699259 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699270 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699286 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699296 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699308 5036 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699319 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699333 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699344 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699356 5036 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699369 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699381 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699393 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699405 5036 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699418 5036 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699429 5036 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699439 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699451 5036 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699462 5036 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699473 5036 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699484 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699496 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699507 5036 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699517 5036 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699526 5036 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.699535 5036 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704534 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704568 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704583 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704614 5036 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704723 5036 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704739 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704753 5036 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704775 5036 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704787 5036 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704800 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704813 5036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704828 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704840 5036 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704851 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704865 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.704991 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705005 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705021 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705036 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705049 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705061 5036 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705072 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705088 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705098 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705109 5036 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705122 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705143 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705245 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705258 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705273 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705284 5036 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705295 5036 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705305 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705319 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705329 5036 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705339 5036 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705349 5036 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705365 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705404 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705500 5036 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705515 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705527 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705538 5036 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705550 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705562 5036 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705573 5036 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705584 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705595 5036 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705613 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705628 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705642 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705656 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705668 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705695 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705706 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705778 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705789 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705800 5036 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705812 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705827 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705837 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705847 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705857 5036 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705869 5036 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705879 5036 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705889 5036 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.705901 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706054 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706065 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706074 5036 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706087 5036 reconciler_common.go:293] "Volume detached for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706099 5036 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.706109 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.715265 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.715373 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.715519 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.719937 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.755167 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.779264 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.791797 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.804337 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.816182 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.826895 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.843237 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.860385 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.874131 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.897739 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.924546 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.939994 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.951271 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 10 16:28:17 crc kubenswrapper[5036]: I0110 16:28:17.983519 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.117951 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.118023 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.118054 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.118072 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118201 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:19.118169302 +0000 UTC m=+20.988404796 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118236 5036 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118260 5036 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118334 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:19.118314446 +0000 UTC m=+20.988549940 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118324 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118384 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118381 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:19.118357287 +0000 UTC m=+20.988592951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118403 5036 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.118504 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:19.11847325 +0000 UTC m=+20.988708914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.144289 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7q49q"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.144831 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.147015 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.147469 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.147729 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.148133 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.175882 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.192797 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.219237 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.219291 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fs4r\" (UniqueName: \"kubernetes.io/projected/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-kube-api-access-9fs4r\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.219314 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-serviceca\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.219335 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-host\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.219526 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.219546 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.219559 5036 projected.go:194] Error preparing data for 
projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.219607 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:19.219588846 +0000 UTC m=+21.089824340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.223056 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.270021 5036 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270495 5036 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270510 5036 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270566 5036 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270515 5036 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270615 5036 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270657 5036 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.270661 5036 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very 
short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.270602 5036 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/events\": read tcp 38.102.83.83:53094->38.102.83.83:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-controller-manager-crc.18896b752c059425 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-10 16:27:59.713539109 +0000 UTC m=+1.583774603,LastTimestamp:2026-01-10 16:27:59.713539109 +0000 UTC m=+1.583774603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.271340 5036 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.271385 5036 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.271421 5036 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.270512 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb/status\": read tcp 38.102.83.83:53094->38.102.83.83:6443: use of closed network connection" Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.271894 5036 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.272290 5036 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: W0110 16:28:18.272338 5036 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 10 16:28:18 crc kubenswrapper[5036]: 
I0110 16:28:18.310491 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.320361 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-host\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.320438 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fs4r\" (UniqueName: \"kubernetes.io/projected/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-kube-api-access-9fs4r\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.320460 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-serviceca\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.320533 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-host\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.322097 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-serviceca\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.347652 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.361329 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fs4r\" (UniqueName: \"kubernetes.io/projected/a6ef98aa-d06f-44d8-a96f-8c261e2521b1-kube-api-access-9fs4r\") pod \"node-ca-7q49q\" (UID: \"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\") " pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.398922 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.417023 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.426871 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7q49q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fs4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7q49q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.463088 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7q49q" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.507116 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:18 crc kubenswrapper[5036]: E0110 16:28:18.507570 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.510840 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.511375 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.513112 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.513917 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.515546 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.516170 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.516810 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.517824 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.518648 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.519772 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.520430 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.520507 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syn
cer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.522187 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.522806 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.523452 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.524583 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.525226 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.526362 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.526834 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.527465 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.528777 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.535414 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.536374 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.536410 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.538326 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.539185 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.545258 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.545965 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.552172 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.552337 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.552789 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.553584 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.554110 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.554746 5036 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.554882 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.557821 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.558861 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.559472 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.561206 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.562441 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.563030 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.564221 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.565069 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.565585 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.566834 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.568250 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.569086 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.570376 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.571091 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.572208 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.573369 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.574573 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.575358 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.576006 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.577370 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.577705 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.578471 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.580193 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.589542 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.609079 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.622104 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.632498 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.646339 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7q49q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fs4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7q49q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.648320 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7q49q" event={"ID":"a6ef98aa-d06f-44d8-a96f-8c261e2521b1","Type":"ContainerStarted","Data":"f48c62053802a216ad136062677d93d71934ebf80ac633faafa33b5a04b7ad39"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.649255 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"90e72a907146fd8c205f540b2b55f53297d2cec24b13e618eed56140586885a9"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.649279 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"acf00867932c9fb578dadcdda677e8f3a50497aba67af65a56902d2cf4eed65d"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.660510 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"01d7eb8cc1990d64daf5bd6dbcbf768df02036028d9c7ac1d9885d7dea85a64e"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.679053 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"262fa2b49bab0c2eb3bdfe5bffaa593a2b57205dd9ab36e11a772ed3cfb9dace"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.679344 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0fb72f484c0faa9d7d3f02b2aa4449ffa12a47eb95f24cf13788793b9c876693"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.679389 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4c6cd2f08cb09b56a8f641b0a45c0c492cddd93c10a7f1537c8885540570d37f"} Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.680148 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.692916 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7q49q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fs4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7q49q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.709561 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.726329 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.739850 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.771486 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.788891 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.803387 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.816714 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e72a907146fd8c205f540b2b55f53297d2cec24b13e618eed56140586885a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.836582 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7q49q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fs4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7q49q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.860802 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.882395 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.895018 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.913552 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.926744 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.940046 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.956380 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e72a907146fd8c205f540b2b55f53297d2cec24b13e618eed56140586885a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.974771 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://262fa2b49bab0c2eb3bdfe5bffaa593a2b57205dd9ab36e11a772ed3cfb9dace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fb72f484c0faa9d7d3f02b2aa4449ffa12a47eb95f24cf13788793b9c876693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:18Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.975186 5036 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-xvw8s"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.975552 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.976611 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kqphb"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.976855 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.977946 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c4vw5"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.978700 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.978772 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.979498 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980303 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980409 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980437 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980475 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980644 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5kmzz"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.980778 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.981202 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-44nd6"] Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.981294 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.981417 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-44nd6" Jan 10 16:28:18 crc kubenswrapper[5036]: I0110 16:28:18.992007 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.004519 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.004857 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.006782 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007218 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007279 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007326 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007371 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007416 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007351 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007519 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007823 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007842 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007098 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.007994 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.019885 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030196 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c55fd5c5-5f04-4347-810b-dffe41887f83-hosts-file\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030238 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzrp\" (UniqueName: \"kubernetes.io/projected/79756361-741e-4470-831b-6ee092bc6277-kube-api-access-hqzrp\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030260 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-socket-dir-parent\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030277 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030298 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030317 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-bin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030388 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-conf-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030407 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030427 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-system-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030444 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030462 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030482 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030498 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030516 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79756361-741e-4470-831b-6ee092bc6277-proxy-tls\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030534 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-cnibin\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030556 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-multus\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030571 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-etc-kubernetes\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030601 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030622 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030639 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030658 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-cnibin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030691 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-os-release\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 
16:28:19.030708 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xdb6\" (UniqueName: \"kubernetes.io/projected/baa66345-69f5-4a8c-b0fd-c28f048c239b-kube-api-access-4xdb6\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030730 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030768 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030787 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030806 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79756361-741e-4470-831b-6ee092bc6277-rootfs\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030822 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-os-release\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030838 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-hostroot\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030864 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030882 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib\") pod \"ovnkube-node-c4vw5\" (UID: 
\"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030897 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-cni-binary-copy\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030914 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-kubelet\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030937 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030953 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030969 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-system-cni-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.030987 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-multus-certs\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031012 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031033 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031052 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zqs\" 
(UniqueName: \"kubernetes.io/projected/c55fd5c5-5f04-4347-810b-dffe41887f83-kube-api-access-p6zqs\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031068 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79756361-741e-4470-831b-6ee092bc6277-mcd-auth-proxy-config\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031085 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-multus-daemon-config\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031114 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031136 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-k8s-cni-cncf-io\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031175 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d86dc\" (UniqueName: \"kubernetes.io/projected/91a78516-865b-40eb-8545-8f24206fe927-kube-api-access-d86dc\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031194 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031209 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031226 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031261 5036 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b65c4\" (UniqueName: \"kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.031275 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-netns\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.052097 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.068511 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.083221 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7q49q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6ef98aa-d06f-44d8-a96f-8c261e2521b1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fs4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7q49q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.108515 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xvw8s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c55fd5c5-5f04-4347-810b-dffe41887f83\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p6zqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xvw8s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.124674 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e098c043-2e79-4678-bc14-4306571d12df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0110 16:28:10.988176 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0110 16:28:10.989232 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1614077472/tls.crt::/tmp/serving-cert-1614077472/tls.key\\\\\\\"\\\\nI0110 16:28:16.535521 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0110 16:28:16.537721 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0110 16:28:16.537744 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0110 16:28:16.537779 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0110 16:28:16.537791 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0110 16:28:16.543658 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0110 16:28:16.543825 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543843 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0110 16:28:16.543855 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0110 16:28:16.543871 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0110 16:28:16.543878 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0110 16:28:16.543885 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0110 16:28:16.544325 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0110 16:28:16.545067 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-10T16:27:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.131952 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.132086 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.132067593 +0000 UTC m=+23.002303087 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132116 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132140 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-k8s-cni-cncf-io\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132160 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d86dc\" (UniqueName: \"kubernetes.io/projected/91a78516-865b-40eb-8545-8f24206fe927-kube-api-access-d86dc\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132179 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132195 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132213 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132237 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b65c4\" (UniqueName: \"kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132252 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-netns\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: 
I0110 16:28:19.132268 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c55fd5c5-5f04-4347-810b-dffe41887f83-hosts-file\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132284 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzrp\" (UniqueName: \"kubernetes.io/projected/79756361-741e-4470-831b-6ee092bc6277-kube-api-access-hqzrp\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132300 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-socket-dir-parent\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132318 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132336 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132352 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-bin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132368 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-conf-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132384 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132401 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132417 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132436 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79756361-741e-4470-831b-6ee092bc6277-proxy-tls\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132452 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-cnibin\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132467 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132514 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-system-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132515 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-k8s-cni-cncf-io\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132529 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132561 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-multus\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132584 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-etc-kubernetes\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132614 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132633 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132654 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132673 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-os-release\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132710 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xdb6\" (UniqueName: \"kubernetes.io/projected/baa66345-69f5-4a8c-b0fd-c28f048c239b-kube-api-access-4xdb6\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132718 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132731 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-cnibin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132750 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132770 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132790 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: 
\"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132814 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132836 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132858 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132861 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79756361-741e-4470-831b-6ee092bc6277-rootfs\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132954 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-etc-kubernetes\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132968 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-multus\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132990 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133014 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-cni-bin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 
crc kubenswrapper[5036]: I0110 16:28:19.133028 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133057 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-conf-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133064 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133108 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133144 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133323 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133344 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-os-release\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133364 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133387 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132212 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133553 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-cnibin\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133585 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-multus-socket-dir-parent\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133616 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133596 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.132879 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79756361-741e-4470-831b-6ee092bc6277-rootfs\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133803 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133820 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133836 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-system-cni-dir\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133826 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c55fd5c5-5f04-4347-810b-dffe41887f83-hosts-file\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " 
pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133869 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-os-release\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133889 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-hostroot\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133939 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-os-release\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133962 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-hostroot\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.133986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134003 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134040 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134056 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-cni-binary-copy\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134072 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-kubelet\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134090 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134108 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134125 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134188 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zqs\" (UniqueName: \"kubernetes.io/projected/c55fd5c5-5f04-4347-810b-dffe41887f83-kube-api-access-p6zqs\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134208 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79756361-741e-4470-831b-6ee092bc6277-mcd-auth-proxy-config\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134225 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-system-cni-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134241 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-multus-certs\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134258 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-multus-daemon-config\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134524 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134557 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-cnibin\") pod 
\"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134569 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134589 5036 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134594 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-var-lib-kubelet\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134535 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/baa66345-69f5-4a8c-b0fd-c28f048c239b-cni-binary-copy\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134647 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134623 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-netns\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134654 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134750 5036 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134751 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.134725075 +0000 UTC m=+23.004960569 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.134834 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.134801537 +0000 UTC m=+23.005037171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134866 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134894 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/baa66345-69f5-4a8c-b0fd-c28f048c239b-system-cni-dir\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134894 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.134987 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-multus-daemon-config\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.135171 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/91a78516-865b-40eb-8545-8f24206fe927-host-run-multus-certs\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.135215 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/91a78516-865b-40eb-8545-8f24206fe927-cni-binary-copy\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.135230 5036 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.135311 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.13529052 +0000 UTC m=+23.005526014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.135652 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79756361-741e-4470-831b-6ee092bc6277-mcd-auth-proxy-config\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.135875 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.135943 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.144056 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.144142 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79756361-741e-4470-831b-6ee092bc6277-proxy-tls\") pod \"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.144702 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.150612 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b65c4\" (UniqueName: \"kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4\") pod \"ovnkube-node-c4vw5\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.151191 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zqs\" (UniqueName: \"kubernetes.io/projected/c55fd5c5-5f04-4347-810b-dffe41887f83-kube-api-access-p6zqs\") pod \"node-resolver-xvw8s\" (UID: \"c55fd5c5-5f04-4347-810b-dffe41887f83\") " pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.152612 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d86dc\" (UniqueName: \"kubernetes.io/projected/91a78516-865b-40eb-8545-8f24206fe927-kube-api-access-d86dc\") pod \"multus-44nd6\" (UID: \"91a78516-865b-40eb-8545-8f24206fe927\") " pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.153667 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzrp\" (UniqueName: \"kubernetes.io/projected/79756361-741e-4470-831b-6ee092bc6277-kube-api-access-hqzrp\") pod 
\"machine-config-daemon-kqphb\" (UID: \"79756361-741e-4470-831b-6ee092bc6277\") " pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.182408 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e72a907146fd8c205f540b2b55f53297d2cec24b13e618eed56140586885a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.186197 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.235399 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.235593 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.235617 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.235630 5036 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.235705 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.235671266 +0000 UTC m=+23.105906760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.238947 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://262fa2b49bab0c2eb3bdfe5bffaa593a2b57205dd9ab36e11a772ed3cfb9dace\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fb72f484c0faa9d7d3f02b2aa4449ffa12a47eb95f24cf13788793b9c876693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.245395 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.255084 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xdb6\" (UniqueName: \"kubernetes.io/projected/baa66345-69f5-4a8c-b0fd-c28f048c239b-kube-api-access-4xdb6\") pod \"multus-additional-cni-plugins-5kmzz\" (UID: \"baa66345-69f5-4a8c-b0fd-c28f048c239b\") " pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.266264 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.288223 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xvw8s" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.303334 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.305760 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.310897 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.316501 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.322110 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-44nd6" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.327131 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.357319 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: W0110 16:28:19.373458 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a78516_865b_40eb_8545_8f24206fe927.slice/crio-2f763f1f90bad9c730902d0f9db97c46788d8566c4c2742f8b7a776b0ebbc43b WatchSource:0}: Error finding container 2f763f1f90bad9c730902d0f9db97c46788d8566c4c2742f8b7a776b0ebbc43b: Status 404 returned error can't find the container with id 2f763f1f90bad9c730902d0f9db97c46788d8566c4c2742f8b7a776b0ebbc43b Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.400228 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.406078 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.457988 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.497342 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.506103 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.507152 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.507246 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.507294 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:19 crc kubenswrapper[5036]: E0110 16:28:19.507429 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.525565 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.575785 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79756361-741e-4470-831b-6ee092bc6277\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqzrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hqzrp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kqphb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.626601 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.628274 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98fa8c41-2298-4b26-849a-806cc77bcc40\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b65c4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:28:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-c4vw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.674607 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7q49q" event={"ID":"a6ef98aa-d06f-44d8-a96f-8c261e2521b1","Type":"ContainerStarted","Data":"f768ebae1378a35622cc3e8fbb13b678ed0ddc027c7619f776060a0ebe490afe"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.676321 5036 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerStarted","Data":"0d396635d8d6007a89c41d19e70088e3f0cc69ddc8457967edd9d3d0c2a024a5"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.677335 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerStarted","Data":"6633546bdab9e108b34370f80ecb2923008308d8d5e3b0af49c4bafe70435cec"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.678197 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44nd6" event={"ID":"91a78516-865b-40eb-8545-8f24206fe927","Type":"ContainerStarted","Data":"4acfe31af1f4adc287a0f51bd888e7ad0d76662326dfa32e1fd87c6efcddb7f9"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.678227 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44nd6" event={"ID":"91a78516-865b-40eb-8545-8f24206fe927","Type":"ContainerStarted","Data":"2f763f1f90bad9c730902d0f9db97c46788d8566c4c2742f8b7a776b0ebbc43b"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.678359 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5e4a161-5178-43cf-92a8-f0342e478934\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-10T16:27:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0547d30d385cb9ff12471f7e5474640eca3c2f9f5ae8a39c751c2f650c3fc6a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72636dcf4fe3412c63af96e24daec814e37772b8760300f122d699a22efe67c9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:27:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b6bfba50cee7c3e324ec14bc78b6165e04b2f8c3a4878bab6c9a19ec014e458\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-10T16:27:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.679598 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" exitCode=0 Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.679652 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.679688 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" 
event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"7c56941243f558ebeb0929b024f0db8f4f29f0fdf7b48537f8ec36e00cdbf1c7"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.682293 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"bd482150cf86ffe8b966604320efadab6529ba4dfb0ed9c01dbe269d8fea217c"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.682371 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.682385 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"34621f4378ddf17fed1253cd5a5f39569852fa5cab9fbd36d2acc2f7a4bcbd4f"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.685019 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvw8s" event={"ID":"c55fd5c5-5f04-4347-810b-dffe41887f83","Type":"ContainerStarted","Data":"3192ba9963ef456b233e903a7a23fafbeb001d166854b117f9000140fdaf5567"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.685076 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xvw8s" event={"ID":"c55fd5c5-5f04-4347-810b-dffe41887f83","Type":"ContainerStarted","Data":"4f1bc50d66458e00233eb4405af4d76ef763b0df607345c0050ac3dc04d711fd"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.718189 5036 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-10T16:28:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90e72a907146fd8c205f540b2b55f53297d2cec24b13e618eed56140586885a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-10T16:28:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-10T16:28:19Z is after 2025-08-24T17:21:41Z" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.728299 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.734757 5036 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.738519 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.738575 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.738587 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.738769 5036 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.808155 5036 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.808498 5036 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.809828 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.809874 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.809888 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.809908 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.809920 5036 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-10T16:28:19Z","lastTransitionTime":"2026-01-10T16:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.830341 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.830381 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.830391 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.830410 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.830422 5036 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-10T16:28:19Z","lastTransitionTime":"2026-01-10T16:28:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.845877 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.886553 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 10 16:28:19 crc kubenswrapper[5036]: I0110 16:28:19.906846 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.114245 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=4.114211304 podStartE2EDuration="4.114211304s" podCreationTimestamp="2026-01-10 16:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.078630232 +0000 UTC m=+21.948865736" watchObservedRunningTime="2026-01-10 16:28:20.114211304 +0000 UTC m=+21.984446798" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.206023 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp"] Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.207422 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.238063 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lzkzv"] Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.238579 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: E0110 16:28:20.238654 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.245887 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.265963 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345408 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345454 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345495 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzs7v\" (UniqueName: \"kubernetes.io/projected/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-kube-api-access-xzs7v\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345530 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345569 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.345590 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhmzg\" (UniqueName: \"kubernetes.io/projected/ad7efc6c-744c-4f32-b2db-32d54f0b630f-kube-api-access-dhmzg\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.398467 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=4.398443214 podStartE2EDuration="4.398443214s" podCreationTimestamp="2026-01-10 16:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.357108106 +0000 UTC m=+22.227343610" watchObservedRunningTime="2026-01-10 16:28:20.398443214 +0000 UTC m=+22.268678708" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.438101 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podStartSLOduration=2.438075426 podStartE2EDuration="2.438075426s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.437955683 +0000 UTC m=+22.308191187" watchObservedRunningTime="2026-01-10 16:28:20.438075426 +0000 UTC m=+22.308310920" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446713 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446773 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446797 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzs7v\" (UniqueName: \"kubernetes.io/projected/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-kube-api-access-xzs7v\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446818 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446852 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.446873 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhmzg\" (UniqueName: \"kubernetes.io/projected/ad7efc6c-744c-4f32-b2db-32d54f0b630f-kube-api-access-dhmzg\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: E0110 16:28:20.447209 5036 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:20 crc 
kubenswrapper[5036]: E0110 16:28:20.447297 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs podName:b4ede2a2-1cff-4d29-8b81-16de7162b5fe nodeName:}" failed. No retries permitted until 2026-01-10 16:28:20.947280545 +0000 UTC m=+22.817516039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs") pod "network-metrics-daemon-lzkzv" (UID: "b4ede2a2-1cff-4d29-8b81-16de7162b5fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.447486 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.447778 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.461801 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad7efc6c-744c-4f32-b2db-32d54f0b630f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.488901 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhmzg\" (UniqueName: \"kubernetes.io/projected/ad7efc6c-744c-4f32-b2db-32d54f0b630f-kube-api-access-dhmzg\") pod \"ovnkube-control-plane-749d76644c-c64zp\" (UID: \"ad7efc6c-744c-4f32-b2db-32d54f0b630f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.499275 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzs7v\" (UniqueName: \"kubernetes.io/projected/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-kube-api-access-xzs7v\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.512242 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:20 crc kubenswrapper[5036]: E0110 16:28:20.512413 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.552586 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7q49q" podStartSLOduration=2.552561714 podStartE2EDuration="2.552561714s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.552478751 +0000 UTC m=+22.422714245" watchObservedRunningTime="2026-01-10 16:28:20.552561714 +0000 UTC m=+22.422797218" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.553655 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-44nd6" podStartSLOduration=2.553628873 podStartE2EDuration="2.553628873s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.523553079 +0000 UTC m=+22.393788583" watchObservedRunningTime="2026-01-10 16:28:20.553628873 +0000 UTC m=+22.423864387" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.602736 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xvw8s" podStartSLOduration=2.60270575 podStartE2EDuration="2.60270575s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:20.602282329 +0000 UTC m=+22.472517823" watchObservedRunningTime="2026-01-10 16:28:20.60270575 +0000 UTC m=+22.472941254" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.678313 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.689093 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"38788150658a0986730cf262ef4c34e8c700cef4e476165ed31f18978db77587"} Jan 10 16:28:20 crc kubenswrapper[5036]: W0110 16:28:20.693048 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7efc6c_744c_4f32_b2db_32d54f0b630f.slice/crio-c9992842e81e8589d53e60a816141e0ec5b5346ef9ad15cf4f22a5349b1bbe76 WatchSource:0}: Error finding container c9992842e81e8589d53e60a816141e0ec5b5346ef9ad15cf4f22a5349b1bbe76: Status 404 returned error can't find the container with id c9992842e81e8589d53e60a816141e0ec5b5346ef9ad15cf4f22a5349b1bbe76 Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.693068 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.693148 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.693181 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.693196 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.694572 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="0d396635d8d6007a89c41d19e70088e3f0cc69ddc8457967edd9d3d0c2a024a5" exitCode=0 Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.695018 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"0d396635d8d6007a89c41d19e70088e3f0cc69ddc8457967edd9d3d0c2a024a5"} Jan 10 16:28:20 crc kubenswrapper[5036]: I0110 16:28:20.959048 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:20 crc kubenswrapper[5036]: E0110 16:28:20.959296 5036 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:20 crc kubenswrapper[5036]: E0110 16:28:20.959391 5036 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs podName:b4ede2a2-1cff-4d29-8b81-16de7162b5fe nodeName:}" failed. No retries permitted until 2026-01-10 16:28:21.959373949 +0000 UTC m=+23.829609443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs") pod "network-metrics-daemon-lzkzv" (UID: "b4ede2a2-1cff-4d29-8b81-16de7162b5fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.161648 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.161845 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:25.161817636 +0000 UTC m=+27.032053140 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.161896 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.161930 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.161953 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162042 5036 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162044 5036 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162054 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162116 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:25.162104454 +0000 UTC m=+27.032339948 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162123 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162134 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:25.162127234 +0000 UTC m=+27.032362738 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162140 5036 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.162193 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:25.162180766 +0000 UTC m=+27.032416260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.262899 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.263118 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.263150 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.263166 5036 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.263233 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:25.263213409 +0000 UTC m=+27.133448903 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.507964 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.508003 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.508421 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.508529 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.508775 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.508850 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.702040 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" event={"ID":"ad7efc6c-744c-4f32-b2db-32d54f0b630f","Type":"ContainerStarted","Data":"1b26a348d860821bd43527e71a2eb995f34a7ccf21f61a12499566b522066521"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.702120 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" event={"ID":"ad7efc6c-744c-4f32-b2db-32d54f0b630f","Type":"ContainerStarted","Data":"79ba2fee9d025dfb17651ab433ea955154a22afcd8cd82283923e4dadbe11d94"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.702147 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" event={"ID":"ad7efc6c-744c-4f32-b2db-32d54f0b630f","Type":"ContainerStarted","Data":"c9992842e81e8589d53e60a816141e0ec5b5346ef9ad15cf4f22a5349b1bbe76"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.703868 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="981f99b8656ef32d411cde918227985034517b651ad57ee48be54931760dc6c5" exitCode=0 Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.704008 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"981f99b8656ef32d411cde918227985034517b651ad57ee48be54931760dc6c5"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.712656 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.713134 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.714984 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h"] Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 
16:28:21.715452 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.717246 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.717715 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.717780 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.718307 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.756276 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c64zp" podStartSLOduration=3.7562543379999997 podStartE2EDuration="3.756254338s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:21.735414095 +0000 UTC m=+23.605649589" watchObservedRunningTime="2026-01-10 16:28:21.756254338 +0000 UTC m=+23.626489842" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.872218 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f2cb403-c09f-404f-b5db-ce99e62dc6db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.872408 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f2cb403-c09f-404f-b5db-ce99e62dc6db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.872451 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2cb403-c09f-404f-b5db-ce99e62dc6db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.872522 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.872762 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975318 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f2cb403-c09f-404f-b5db-ce99e62dc6db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975371 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f2cb403-c09f-404f-b5db-ce99e62dc6db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975400 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2cb403-c09f-404f-b5db-ce99e62dc6db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975436 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975469 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975512 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975600 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.975843 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7f2cb403-c09f-404f-b5db-ce99e62dc6db-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: 
\"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.976008 5036 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: E0110 16:28:21.976143 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs podName:b4ede2a2-1cff-4d29-8b81-16de7162b5fe nodeName:}" failed. No retries permitted until 2026-01-10 16:28:23.976112857 +0000 UTC m=+25.846348351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs") pod "network-metrics-daemon-lzkzv" (UID: "b4ede2a2-1cff-4d29-8b81-16de7162b5fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.976336 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f2cb403-c09f-404f-b5db-ce99e62dc6db-service-ca\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:21 crc kubenswrapper[5036]: I0110 16:28:21.983456 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f2cb403-c09f-404f-b5db-ce99e62dc6db-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.001475 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f2cb403-c09f-404f-b5db-ce99e62dc6db-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-5sc9h\" (UID: \"7f2cb403-c09f-404f-b5db-ce99e62dc6db\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.056072 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" Jan 10 16:28:22 crc kubenswrapper[5036]: W0110 16:28:22.092801 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2cb403_c09f_404f_b5db_ce99e62dc6db.slice/crio-0c91a570fff9fbd0bc37c7aede75ea142c677924aafe44311239a77cd9d2e0f6 WatchSource:0}: Error finding container 0c91a570fff9fbd0bc37c7aede75ea142c677924aafe44311239a77cd9d2e0f6: Status 404 returned error can't find the container with id 0c91a570fff9fbd0bc37c7aede75ea142c677924aafe44311239a77cd9d2e0f6 Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.507404 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:22 crc kubenswrapper[5036]: E0110 16:28:22.507589 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.717815 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" event={"ID":"7f2cb403-c09f-404f-b5db-ce99e62dc6db","Type":"ContainerStarted","Data":"f498b6ff28576a27326bddae5f273a8f78852ec222882c745a0f0b2658bbade1"} Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.717940 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" event={"ID":"7f2cb403-c09f-404f-b5db-ce99e62dc6db","Type":"ContainerStarted","Data":"0c91a570fff9fbd0bc37c7aede75ea142c677924aafe44311239a77cd9d2e0f6"} Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.720770 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="f23d004ee59c26b2c841f88ad85b7160cbff062c588fce919b3c9952b601ab36" exitCode=0 Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.720828 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"f23d004ee59c26b2c841f88ad85b7160cbff062c588fce919b3c9952b601ab36"} Jan 10 16:28:22 crc kubenswrapper[5036]: I0110 16:28:22.769071 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-5sc9h" podStartSLOduration=4.769038839 podStartE2EDuration="4.769038839s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:22.735296656 +0000 UTC m=+24.605532230" watchObservedRunningTime="2026-01-10 16:28:22.769038839 +0000 UTC m=+24.639274353" Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.507972 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.508027 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.508107 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:23 crc kubenswrapper[5036]: E0110 16:28:23.508159 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:23 crc kubenswrapper[5036]: E0110 16:28:23.508310 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:23 crc kubenswrapper[5036]: E0110 16:28:23.508493 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.731120 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="246e27bbd727c75cae9ade46623240fa63a3b505f90bcccf4af90c184303fb54" exitCode=0 Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.731183 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"246e27bbd727c75cae9ade46623240fa63a3b505f90bcccf4af90c184303fb54"} Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.740317 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} Jan 10 16:28:23 crc kubenswrapper[5036]: I0110 16:28:23.998195 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:23 crc kubenswrapper[5036]: E0110 16:28:23.998370 5036 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:23 crc kubenswrapper[5036]: E0110 16:28:23.998425 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs podName:b4ede2a2-1cff-4d29-8b81-16de7162b5fe nodeName:}" failed. No retries permitted until 2026-01-10 16:28:27.998405559 +0000 UTC m=+29.868641053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs") pod "network-metrics-daemon-lzkzv" (UID: "b4ede2a2-1cff-4d29-8b81-16de7162b5fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:24 crc kubenswrapper[5036]: I0110 16:28:24.507592 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:24 crc kubenswrapper[5036]: E0110 16:28:24.507851 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:24 crc kubenswrapper[5036]: I0110 16:28:24.747620 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="becaeb576370e92ff464675705295214f4bc8a6c21b1c8af1eef354e6f9222ba" exitCode=0 Jan 10 16:28:24 crc kubenswrapper[5036]: I0110 16:28:24.748298 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"becaeb576370e92ff464675705295214f4bc8a6c21b1c8af1eef354e6f9222ba"} Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.211106 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211316 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:33.211288333 +0000 UTC m=+35.081523827 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.211374 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.211404 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.211425 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211541 5036 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211546 5036 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211561 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211578 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:33.21157152 +0000 UTC m=+35.081807014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211585 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211603 5036 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211606 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:33.211589551 +0000 UTC m=+35.081825055 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.211644 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:33.211631282 +0000 UTC m=+35.081866786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.312549 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.312904 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.312989 5036 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.313022 5036 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.313179 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-10 16:28:33.313139908 +0000 UTC m=+35.183375542 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.507493 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.507572 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.508264 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.507572 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.508445 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:25 crc kubenswrapper[5036]: E0110 16:28:25.508515 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.755838 5036 generic.go:334] "Generic (PLEG): container finished" podID="baa66345-69f5-4a8c-b0fd-c28f048c239b" containerID="7e6be326225f2bd5f7ac54127119fde695532dc20d82d9f6e888aeb5d5031b99" exitCode=0 Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.755887 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerDied","Data":"7e6be326225f2bd5f7ac54127119fde695532dc20d82d9f6e888aeb5d5031b99"} Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.782282 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerStarted","Data":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.783279 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.783313 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.814795 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.826508 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:25 crc kubenswrapper[5036]: I0110 16:28:25.857142 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podStartSLOduration=7.857115435 podStartE2EDuration="7.857115435s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:25.823575148 +0000 UTC m=+27.693810642" watchObservedRunningTime="2026-01-10 16:28:25.857115435 +0000 UTC m=+27.727350929" Jan 10 16:28:26 crc kubenswrapper[5036]: I0110 16:28:26.507511 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:26 crc kubenswrapper[5036]: E0110 16:28:26.507765 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:26 crc kubenswrapper[5036]: I0110 16:28:26.791004 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" event={"ID":"baa66345-69f5-4a8c-b0fd-c28f048c239b","Type":"ContainerStarted","Data":"cbabceddb1b7b427ecfb87e7d8c1f8bae3222f4ed937958fe7026dffde335769"} Jan 10 16:28:26 crc kubenswrapper[5036]: I0110 16:28:26.791123 5036 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 10 16:28:26 crc kubenswrapper[5036]: I0110 16:28:26.821467 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5kmzz" podStartSLOduration=8.821446925 podStartE2EDuration="8.821446925s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:26.82050727 +0000 UTC m=+28.690742764" watchObservedRunningTime="2026-01-10 16:28:26.821446925 +0000 UTC m=+28.691682429" Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.507102 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:27 crc kubenswrapper[5036]: E0110 16:28:27.507547 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.507198 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.507144 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:27 crc kubenswrapper[5036]: E0110 16:28:27.508189 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:27 crc kubenswrapper[5036]: E0110 16:28:27.508230 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.604792 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lzkzv"] Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.794788 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:27 crc kubenswrapper[5036]: E0110 16:28:27.794984 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:27 crc kubenswrapper[5036]: I0110 16:28:27.795588 5036 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 10 16:28:28 crc kubenswrapper[5036]: I0110 16:28:28.044726 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:28 crc kubenswrapper[5036]: E0110 16:28:28.044962 5036 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:28 crc kubenswrapper[5036]: E0110 16:28:28.045103 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs podName:b4ede2a2-1cff-4d29-8b81-16de7162b5fe nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.045058708 +0000 UTC m=+37.915294212 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs") pod "network-metrics-daemon-lzkzv" (UID: "b4ede2a2-1cff-4d29-8b81-16de7162b5fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 10 16:28:28 crc kubenswrapper[5036]: I0110 16:28:28.508053 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:28 crc kubenswrapper[5036]: E0110 16:28:28.510499 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:29 crc kubenswrapper[5036]: I0110 16:28:29.507905 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:29 crc kubenswrapper[5036]: I0110 16:28:29.507905 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:29 crc kubenswrapper[5036]: I0110 16:28:29.507961 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:29 crc kubenswrapper[5036]: E0110 16:28:29.508533 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lzkzv" podUID="b4ede2a2-1cff-4d29-8b81-16de7162b5fe" Jan 10 16:28:29 crc kubenswrapper[5036]: E0110 16:28:29.508601 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 10 16:28:29 crc kubenswrapper[5036]: E0110 16:28:29.508653 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 10 16:28:30 crc kubenswrapper[5036]: I0110 16:28:30.507930 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:30 crc kubenswrapper[5036]: E0110 16:28:30.508058 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.068155 5036 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.068356 5036 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.126131 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7lh8w"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.127622 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.128397 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.129148 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.132073 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.132733 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.137695 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.141743 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.142205 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.142366 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-45j5v"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.143019 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.143479 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8vvs"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.144458 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145848 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.143492 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.144314 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.144839 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.144996 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145206 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145241 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145289 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145367 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145407 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145439 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145478 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145520 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145557 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145605 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.145781 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.149191 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.153345 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lx6q9"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.153511 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.154831 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.155112 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.155180 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.156061 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.156218 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.157160 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.163290 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.164343 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.164938 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.165351 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.166119 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.167895 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.168318 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.168436 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.169015 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.169267 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v44cl"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.172087 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.172383 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.177186 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.178371 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kcb5k"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.179222 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.182953 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.183501 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.186923 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.187360 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.188050 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.198971 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.199738 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.200246 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.200557 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.201051 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.201517 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.188100 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.203599 5036 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.203837 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.204933 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.205502 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.214257 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.214356 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.214644 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.214662 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.214874 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.215041 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.215788 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.215914 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216026 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216179 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216243 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216319 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216332 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216427 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.216634 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 
crc kubenswrapper[5036]: I0110 16:28:31.219998 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.220222 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.220359 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.223294 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.224903 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.224944 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.226263 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.226689 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.227040 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.227146 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.227154 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.227151 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.235989 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.236083 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.236208 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.236318 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.237951 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.237982 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238018 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e847b6c6-710f-4a76-9887-bac022f8de18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238043 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-auth-proxy-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238067 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvnkg\" (UniqueName: \"kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238106 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b7a0ba-113f-4f0a-a6c5-f5850de92916-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238141 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238165 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-client\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238210 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: 
\"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238226 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238242 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b7892c-5976-4209-821e-be876e2d43a1-serving-cert\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238262 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620c8e3-4592-4189-b074-4ea40e9447ff-service-ca-bundle\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238281 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac022-3179-463e-a9b9-6c9890a8baea-metrics-tls\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238307 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238315 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238322 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-default-certificate\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238343 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwgfw\" (UniqueName: \"kubernetes.io/projected/5620c8e3-4592-4189-b074-4ea40e9447ff-kube-api-access-rwgfw\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238363 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238385 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238412 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238425 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238456 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit-dir\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238472 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238488 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238513 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-node-pullsecrets\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238557 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7h2w\" (UniqueName: \"kubernetes.io/projected/05b7892c-5976-4209-821e-be876e2d43a1-kube-api-access-v7h2w\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238573 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238577 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxgw\" (UniqueName: \"kubernetes.io/projected/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-kube-api-access-8cxgw\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238743 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238758 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmdnp\" (UniqueName: \"kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238780 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238797 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-serving-cert\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238816 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238833 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/77b7a0ba-113f-4f0a-a6c5-f5850de92916-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238854 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/253187aa-7581-4eb5-ab49-bc4d53a47810-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238873 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238887 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-config\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238911 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-serving-cert\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238928 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238944 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-metrics-certs\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238979 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.238996 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239015 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl8kj\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-kube-api-access-kl8kj\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239032 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239049 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjn2\" (UniqueName: \"kubernetes.io/projected/76a88401-7e1f-4e2d-accb-184ff7867211-kube-api-access-srjn2\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239065 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239081 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-stats-auth\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239095 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239113 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvcz\" (UniqueName: \"kubernetes.io/projected/77b7a0ba-113f-4f0a-a6c5-f5850de92916-kube-api-access-2dvcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239136 5036 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/253187aa-7581-4eb5-ab49-bc4d53a47810-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239155 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239179 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239196 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-config\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239205 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239220 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlc7\" (UniqueName: \"kubernetes.io/projected/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-kube-api-access-7nlc7\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239238 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239256 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878vn\" (UniqueName: \"kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239276 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-encryption-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239294 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-audit-policies\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239310 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76a88401-7e1f-4e2d-accb-184ff7867211-audit-dir\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239328 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4gm\" (UniqueName: \"kubernetes.io/projected/374ac022-3179-463e-a9b9-6c9890a8baea-kube-api-access-2m4gm\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239392 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-machine-approver-tls\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239413 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-images\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239440 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239456 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-image-import-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239476 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239494 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7sspn\" (UniqueName: \"kubernetes.io/projected/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-kube-api-access-7sspn\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239500 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239526 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-client\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239546 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239566 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmgqz\" (UniqueName: \"kubernetes.io/projected/e847b6c6-710f-4a76-9887-bac022f8de18-kube-api-access-vmgqz\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239605 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239639 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239660 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-encryption-config\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239694 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" 
Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239725 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239753 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnpq2\" (UniqueName: \"kubernetes.io/projected/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-kube-api-access-xnpq2\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239780 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e847b6c6-710f-4a76-9887-bac022f8de18-serving-cert\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239811 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.239831 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.240090 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.240224 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.240292 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241080 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241159 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241399 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241472 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241534 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241643 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241671 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241774 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.241914 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.242300 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.242502 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.243257 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.251922 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.252439 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.252557 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.253425 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.253626 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.255044 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.258360 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.260542 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.261134 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.261227 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.263564 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.264001 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.264116 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.264199 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.264929 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.265422 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.265866 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.266229 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.266435 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.267031 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pc2wp"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.267413 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.267476 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.267905 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.282558 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.301517 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.305175 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z6hnf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.312503 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.314501 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.315368 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.316415 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.316546 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.316659 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.319300 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.319457 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.319916 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.320208 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.320529 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.323666 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.323889 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dlncf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.328313 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.329210 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.329335 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.331229 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.331647 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.331757 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.373245 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.374589 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.374813 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.374985 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.375780 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.376609 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.376622 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.376816 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377208 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit-dir\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377358 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit-dir\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377474 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-node-pullsecrets\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377597 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7h2w\" (UniqueName: \"kubernetes.io/projected/05b7892c-5976-4209-821e-be876e2d43a1-kube-api-access-v7h2w\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377710 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-node-pullsecrets\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.377885 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378093 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378124 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378188 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmdnp\" (UniqueName: \"kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378217 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxgw\" (UniqueName: \"kubernetes.io/projected/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-kube-api-access-8cxgw\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378240 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378261 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-serving-cert\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378278 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378310 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/253187aa-7581-4eb5-ab49-bc4d53a47810-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378344 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77b7a0ba-113f-4f0a-a6c5-f5850de92916-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378365 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-config\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378416 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378439 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-serving-cert\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378455 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378471 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-metrics-certs\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378491 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc 
kubenswrapper[5036]: I0110 16:28:31.378510 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378529 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl8kj\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-kube-api-access-kl8kj\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378548 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378569 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjn2\" (UniqueName: \"kubernetes.io/projected/76a88401-7e1f-4e2d-accb-184ff7867211-kube-api-access-srjn2\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378586 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378632 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378649 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-stats-auth\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378691 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvcz\" (UniqueName: \"kubernetes.io/projected/77b7a0ba-113f-4f0a-a6c5-f5850de92916-kube-api-access-2dvcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378730 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/253187aa-7581-4eb5-ab49-bc4d53a47810-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378775 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378800 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-config\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378823 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378843 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlc7\" (UniqueName: \"kubernetes.io/projected/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-kube-api-access-7nlc7\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378862 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.378879 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878vn\" (UniqueName: \"kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379853 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76a88401-7e1f-4e2d-accb-184ff7867211-audit-dir\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379882 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-encryption-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379900 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-audit-policies\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379924 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4gm\" (UniqueName: \"kubernetes.io/projected/374ac022-3179-463e-a9b9-6c9890a8baea-kube-api-access-2m4gm\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379944 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-machine-approver-tls\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379960 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-images\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379985 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-image-import-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380005 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380028 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sspn\" (UniqueName: \"kubernetes.io/projected/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-kube-api-access-7sspn\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380051 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-client\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 
16:28:31.380068 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380089 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmgqz\" (UniqueName: \"kubernetes.io/projected/e847b6c6-710f-4a76-9887-bac022f8de18-kube-api-access-vmgqz\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380117 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380138 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380157 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-encryption-config\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380177 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380212 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380231 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e847b6c6-710f-4a76-9887-bac022f8de18-serving-cert\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380252 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnpq2\" (UniqueName: 
\"kubernetes.io/projected/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-kube-api-access-xnpq2\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380258 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-service-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380271 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380293 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380291 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-km6m5"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380319 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380342 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e847b6c6-710f-4a76-9887-bac022f8de18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380366 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b7a0ba-113f-4f0a-a6c5-f5850de92916-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380396 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-auth-proxy-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380417 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mvnkg\" (UniqueName: \"kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380435 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-client\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380454 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380473 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380499 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b7892c-5976-4209-821e-be876e2d43a1-serving-cert\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380522 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380543 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380564 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620c8e3-4592-4189-b074-4ea40e9447ff-service-ca-bundle\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380591 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac022-3179-463e-a9b9-6c9890a8baea-metrics-tls\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380612 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380631 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-default-certificate\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380650 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwgfw\" (UniqueName: \"kubernetes.io/projected/5620c8e3-4592-4189-b074-4ea40e9447ff-kube-api-access-rwgfw\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380735 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380761 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380782 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.380786 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.381314 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.381330 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/253187aa-7581-4eb5-ab49-bc4d53a47810-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.382005 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379081 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.382075 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/76a88401-7e1f-4e2d-accb-184ff7867211-audit-dir\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379320 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.382520 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-audit\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.383057 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.383209 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.383287 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.383467 5036 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.383610 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77b7a0ba-113f-4f0a-a6c5-f5850de92916-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.384509 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.379694 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.385127 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-audit-policies\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.385312 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.385398 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-serving-cert\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.385800 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.385935 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.386526 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.386533 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5620c8e3-4592-4189-b074-4ea40e9447ff-service-ca-bundle\") 
pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387095 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387341 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-config\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387416 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-config\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387469 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387759 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e847b6c6-710f-4a76-9887-bac022f8de18-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387781 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.387938 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388059 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388135 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-image-import-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388293 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77b7a0ba-113f-4f0a-a6c5-f5850de92916-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388615 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b7892c-5976-4209-821e-be876e2d43a1-config\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388848 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388898 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-images\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.388925 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-auth-proxy-config\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.389241 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.389906 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.389994 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/374ac022-3179-463e-a9b9-6c9890a8baea-metrics-tls\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.390079 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.390443 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.390568 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.390465 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/253187aa-7581-4eb5-ab49-bc4d53a47810-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.391433 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a88401-7e1f-4e2d-accb-184ff7867211-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.391478 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.391614 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-encryption-config\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392245 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392407 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-stats-auth\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392642 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-serving-cert\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392753 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392912 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.392924 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-etcd-client\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.393650 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-metrics-certs\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.393717 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-etcd-client\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.394339 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.394427 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-machine-approver-tls\") pod 
\"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.394502 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5hjq7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.394609 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b7892c-5976-4209-821e-be876e2d43a1-serving-cert\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.394688 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5620c8e3-4592-4189-b074-4ea40e9447ff-default-certificate\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.395556 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.395903 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.396328 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/76a88401-7e1f-4e2d-accb-184ff7867211-encryption-config\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.397348 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e847b6c6-710f-4a76-9887-bac022f8de18-serving-cert\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.397400 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lt5rc"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398038 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398204 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398418 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398691 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398955 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.398993 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.399750 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7lh8w"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.400487 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.400973 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-45j5v"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.401901 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.404212 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lc5jj"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.405506 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.406044 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.406056 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.407258 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.408175 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lx6q9"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.409319 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v44cl"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.411034 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.412800 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.414033 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.415782 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.417325 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8vvs"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.430882 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.432328 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.432883 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.438310 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.439667 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.440858 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.441842 5036 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.443133 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-c8nxz"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.443929 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.444539 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.445904 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.447470 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4qfgz"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.448279 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.448566 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.449779 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dlncf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.451261 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.452787 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w6mpm"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.453743 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.453944 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.454224 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.456186 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lc5jj"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.457796 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z6hnf"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.458249 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pc2wp"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.459200 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.460335 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-km6m5"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.461525 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.462570 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5hjq7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.463606 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.464577 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.465701 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.466634 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.468694 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4qfgz"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.470363 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.471837 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.472892 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w6mpm"] Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.473484 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482070 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/515d0795-5463-4e54-b0d2-ee5b16994fa4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482164 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq7wp\" (UniqueName: \"kubernetes.io/projected/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-kube-api-access-rq7wp\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482211 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/acb54813-4c4d-4b94-9337-19541ac1980e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482236 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-tmpfs\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482259 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdc6\" (UniqueName: \"kubernetes.io/projected/09ff12a0-dcd3-465b-a051-1f0216f9ba57-kube-api-access-pvdc6\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482281 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88184b8f-9aed-4978-bfbb-7054dd96550e-metrics-tls\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482299 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrh9\" (UniqueName: \"kubernetes.io/projected/51291205-9eaf-455b-aa3b-a261761c8c06-kube-api-access-4jrh9\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482337 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482365 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482393 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6vb\" (UniqueName: \"kubernetes.io/projected/acb54813-4c4d-4b94-9337-19541ac1980e-kube-api-access-7x6vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482418 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51291205-9eaf-455b-aa3b-a261761c8c06-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482601 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88184b8f-9aed-4978-bfbb-7054dd96550e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482640 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef797e07-14de-4b71-af82-bd8304e658dc-signing-key\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482745 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ff12a0-dcd3-465b-a051-1f0216f9ba57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482779 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-webhook-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482866 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: 
\"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482908 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbmlj\" (UniqueName: \"kubernetes.io/projected/515d0795-5463-4e54-b0d2-ee5b16994fa4-kube-api-access-gbmlj\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.482965 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b6jc\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-kube-api-access-7b6jc\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.483011 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-config\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.483145 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51291205-9eaf-455b-aa3b-a261761c8c06-proxy-tls\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.483254 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.483287 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjg26\" (UniqueName: \"kubernetes.io/projected/ef797e07-14de-4b71-af82-bd8304e658dc-kube-api-access-hjg26\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.483322 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef797e07-14de-4b71-af82-bd8304e658dc-signing-cabundle\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.491984 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.507068 5036 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.507073 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.507073 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.511806 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.531209 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.551295 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.573330 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584491 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88184b8f-9aed-4978-bfbb-7054dd96550e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584559 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef797e07-14de-4b71-af82-bd8304e658dc-signing-key\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584667 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ff12a0-dcd3-465b-a051-1f0216f9ba57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584753 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-webhook-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584830 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584873 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gbmlj\" (UniqueName: \"kubernetes.io/projected/515d0795-5463-4e54-b0d2-ee5b16994fa4-kube-api-access-gbmlj\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584921 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b6jc\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-kube-api-access-7b6jc\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.584970 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-config\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585025 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51291205-9eaf-455b-aa3b-a261761c8c06-proxy-tls\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585087 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585125 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef797e07-14de-4b71-af82-bd8304e658dc-signing-cabundle\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585162 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjg26\" (UniqueName: \"kubernetes.io/projected/ef797e07-14de-4b71-af82-bd8304e658dc-kube-api-access-hjg26\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585216 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/515d0795-5463-4e54-b0d2-ee5b16994fa4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585333 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq7wp\" (UniqueName: \"kubernetes.io/projected/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-kube-api-access-rq7wp\") pod 
\"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585397 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/acb54813-4c4d-4b94-9337-19541ac1980e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585441 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-tmpfs\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585496 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdc6\" (UniqueName: \"kubernetes.io/projected/09ff12a0-dcd3-465b-a051-1f0216f9ba57-kube-api-access-pvdc6\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585547 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrh9\" (UniqueName: \"kubernetes.io/projected/51291205-9eaf-455b-aa3b-a261761c8c06-kube-api-access-4jrh9\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585616 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88184b8f-9aed-4978-bfbb-7054dd96550e-metrics-tls\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585672 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585755 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.585936 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6vb\" (UniqueName: \"kubernetes.io/projected/acb54813-4c4d-4b94-9337-19541ac1980e-kube-api-access-7x6vb\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.586061 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51291205-9eaf-455b-aa3b-a261761c8c06-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.587543 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-config\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.587844 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-tmpfs\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.589674 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/51291205-9eaf-455b-aa3b-a261761c8c06-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.590498 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/09ff12a0-dcd3-465b-a051-1f0216f9ba57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.590789 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/51291205-9eaf-455b-aa3b-a261761c8c06-proxy-tls\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.590929 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.592139 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.592342 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/acb54813-4c4d-4b94-9337-19541ac1980e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.595194 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-webhook-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.597224 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.611357 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.631407 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.651527 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.672550 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.691575 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.720053 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.732337 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.752181 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.771419 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.791396 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.811831 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.832516 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.841155 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88184b8f-9aed-4978-bfbb-7054dd96550e-metrics-tls\") 
pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.851665 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.878879 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.891537 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/88184b8f-9aed-4978-bfbb-7054dd96550e-trusted-ca\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.892042 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.912664 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.931809 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.951911 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 10 16:28:31 crc kubenswrapper[5036]: I0110 16:28:31.979343 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.001613 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.013524 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.032976 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.066950 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.071798 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.092060 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.119627 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.131605 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.151939 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.172000 5036 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.191753 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.198856 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.202240 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/515d0795-5463-4e54-b0d2-ee5b16994fa4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.212761 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.251978 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.272999 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.292394 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.312722 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.332140 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.339052 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef797e07-14de-4b71-af82-bd8304e658dc-signing-key\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.351468 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.372306 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.389450 5036 request.go:700] Waited for 1.015681428s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dpackage-server-manager-serving-cert&limit=500&resourceVersion=0 Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.392509 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.411642 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.418008 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef797e07-14de-4b71-af82-bd8304e658dc-signing-cabundle\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.432428 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.451246 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.471435 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.491616 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.507394 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.528030 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7h2w\" (UniqueName: \"kubernetes.io/projected/05b7892c-5976-4209-821e-be876e2d43a1-kube-api-access-v7h2w\") pod \"authentication-operator-69f744f599-lx6q9\" (UID: \"05b7892c-5976-4209-821e-be876e2d43a1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.571040 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvcz\" (UniqueName: \"kubernetes.io/projected/77b7a0ba-113f-4f0a-a6c5-f5850de92916-kube-api-access-2dvcz\") pod \"openshift-controller-manager-operator-756b6f6bc6-2rvph\" (UID: \"77b7a0ba-113f-4f0a-a6c5-f5850de92916\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.592823 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl8kj\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-kube-api-access-kl8kj\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.606385 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878vn\" (UniqueName: \"kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn\") pod \"oauth-openshift-558db77b4-c8vvs\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.627198 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjn2\" (UniqueName: \"kubernetes.io/projected/76a88401-7e1f-4e2d-accb-184ff7867211-kube-api-access-srjn2\") pod \"apiserver-7bbb656c7d-dpl6f\" (UID: \"76a88401-7e1f-4e2d-accb-184ff7867211\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:32 crc 
kubenswrapper[5036]: I0110 16:28:32.687973 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlc7\" (UniqueName: \"kubernetes.io/projected/f6db2aeb-98ec-4b01-83b0-a0dc2816bf48-kube-api-access-7nlc7\") pod \"machine-approver-56656f9798-zkxb5\" (UID: \"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.711092 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.715475 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4gm\" (UniqueName: \"kubernetes.io/projected/374ac022-3179-463e-a9b9-6c9890a8baea-kube-api-access-2m4gm\") pod \"dns-operator-744455d44c-v44cl\" (UID: \"374ac022-3179-463e-a9b9-6c9890a8baea\") " pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.732418 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.750778 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.750795 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.772643 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.781825 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.791991 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.805995 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.815120 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.815175 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.829293 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.832364 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.851944 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.872267 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.907193 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sspn\" (UniqueName: \"kubernetes.io/projected/6b14e5d5-1b40-45f6-a5c6-c161eeade0f9-kube-api-access-7sspn\") pod \"machine-api-operator-5694c8668f-45j5v\" (UID: \"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.937940 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvnkg\" (UniqueName: \"kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg\") pod \"route-controller-manager-6576b87f9c-b4wwd\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.948110 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.953442 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwgfw\" (UniqueName: \"kubernetes.io/projected/5620c8e3-4592-4189-b074-4ea40e9447ff-kube-api-access-rwgfw\") pod \"router-default-5444994796-kcb5k\" (UID: \"5620c8e3-4592-4189-b074-4ea40e9447ff\") " pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.955937 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.972629 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmgqz\" (UniqueName: \"kubernetes.io/projected/e847b6c6-710f-4a76-9887-bac022f8de18-kube-api-access-vmgqz\") pod \"openshift-config-operator-7777fb866f-vv7rp\" (UID: \"e847b6c6-710f-4a76-9887-bac022f8de18\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:32 crc kubenswrapper[5036]: I0110 16:28:32.991305 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/253187aa-7581-4eb5-ab49-bc4d53a47810-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-s8dts\" (UID: \"253187aa-7581-4eb5-ab49-bc4d53a47810\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.011656 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.032722 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.052368 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.059469 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.072924 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.101663 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.120532 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.132126 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.152699 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.172828 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.184635 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.192170 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.212180 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.221808 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.222027 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.222071 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:33 crc kubenswrapper[5036]: E0110 16:28:33.222124 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:49.222080781 +0000 UTC m=+51.092316315 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.222452 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.232969 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.233050 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.240578 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.253384 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.271690 5036 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.291657 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.311507 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.324236 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.331509 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.346048 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxgw\" (UniqueName: \"kubernetes.io/projected/2a27702d-fd8a-4b89-883e-a2250c0cb1a9-kube-api-access-8cxgw\") pod \"openshift-apiserver-operator-796bbdcf4f-r5pns\" (UID: \"2a27702d-fd8a-4b89-883e-a2250c0cb1a9\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.352636 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.353913 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnpq2\" (UniqueName: \"kubernetes.io/projected/ec9ab704-1f8b-473b-bbe2-3c09d04991cd-kube-api-access-xnpq2\") pod \"apiserver-76f77b778f-7lh8w\" (UID: \"ec9ab704-1f8b-473b-bbe2-3c09d04991cd\") " pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.354745 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmdnp\" (UniqueName: \"kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp\") pod \"controller-manager-879f6c89f-9r9hf\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.372142 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.391549 5036 request.go:700] Waited for 1.942940186s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 
16:28:33.394423 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.412373 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.438868 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.452520 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.484496 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.491987 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.514574 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.527226 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.533949 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.542872 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.560386 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.565129 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.568728 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.575456 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.583396 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.597049 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.611741 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.617779 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.620398 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.621669 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.674550 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjg26\" (UniqueName: \"kubernetes.io/projected/ef797e07-14de-4b71-af82-bd8304e658dc-kube-api-access-hjg26\") pod \"service-ca-9c57cc56f-dlncf\" (UID: \"ef797e07-14de-4b71-af82-bd8304e658dc\") " pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.683438 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph"] Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.695737 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbmlj\" (UniqueName: \"kubernetes.io/projected/515d0795-5463-4e54-b0d2-ee5b16994fa4-kube-api-access-gbmlj\") pod \"multus-admission-controller-857f4d67dd-z6hnf\" (UID: \"515d0795-5463-4e54-b0d2-ee5b16994fa4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.703609 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq7wp\" (UniqueName: \"kubernetes.io/projected/a1b7d7be-9cb2-4817-89a0-ae511aa199ea-kube-api-access-rq7wp\") pod \"packageserver-d55dfcdfc-zdtsj\" (UID: \"a1b7d7be-9cb2-4817-89a0-ae511aa199ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.708940 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/615043bb-4f5a-497c-9d23-4ef7fe1b7ac8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rv7wh\" (UID: \"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.711859 5036 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-lx6q9"] Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.720297 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.728876 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b6jc\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-kube-api-access-7b6jc\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.743137 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:33 crc kubenswrapper[5036]: W0110 16:28:33.746076 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77b7a0ba_113f_4f0a_a6c5_f5850de92916.slice/crio-defb109369311380648c3d4976a6f46f4bb851ab200cf619e023163c4e66dc09 WatchSource:0}: Error finding container defb109369311380648c3d4976a6f46f4bb851ab200cf619e023163c4e66dc09: Status 404 returned error can't find the container with id defb109369311380648c3d4976a6f46f4bb851ab200cf619e023163c4e66dc09 Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.769452 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrh9\" (UniqueName: \"kubernetes.io/projected/51291205-9eaf-455b-aa3b-a261761c8c06-kube-api-access-4jrh9\") pod \"machine-config-controller-84d6567774-pjrh2\" (UID: \"51291205-9eaf-455b-aa3b-a261761c8c06\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.771444 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6vb\" (UniqueName: \"kubernetes.io/projected/acb54813-4c4d-4b94-9337-19541ac1980e-kube-api-access-7x6vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptztt\" (UID: \"acb54813-4c4d-4b94-9337-19541ac1980e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.791237 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdc6\" (UniqueName: \"kubernetes.io/projected/09ff12a0-dcd3-465b-a051-1f0216f9ba57-kube-api-access-pvdc6\") pod \"cluster-samples-operator-665b6dd947-fnq8l\" (UID: \"09ff12a0-dcd3-465b-a051-1f0216f9ba57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.822169 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/88184b8f-9aed-4978-bfbb-7054dd96550e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8vgqk\" (UID: \"88184b8f-9aed-4978-bfbb-7054dd96550e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.863093 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.864079 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kcb5k" event={"ID":"5620c8e3-4592-4189-b074-4ea40e9447ff","Type":"ContainerStarted","Data":"73bad080135342f993a5f9857012cdafa01d5316748578cc3b3869d2c0b1b5ff"} Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.864119 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kcb5k" event={"ID":"5620c8e3-4592-4189-b074-4ea40e9447ff","Type":"ContainerStarted","Data":"0b3617c33806bae16ca573b747ecc1da5faa9bf83a1eed692bc1d03fb5435a70"} Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.868225 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" event={"ID":"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48","Type":"ContainerStarted","Data":"b1a93fc611ab3b4a1a9ff50cd2ee7acb65876721c967cc877b2645152cdc7756"} Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.869275 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" event={"ID":"77b7a0ba-113f-4f0a-a6c5-f5850de92916","Type":"ContainerStarted","Data":"defb109369311380648c3d4976a6f46f4bb851ab200cf619e023163c4e66dc09"} Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.882486 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.887987 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.894217 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" event={"ID":"05b7892c-5976-4209-821e-be876e2d43a1","Type":"ContainerStarted","Data":"2aecdc1ccddb40e57f0a970607b24baf65bbeb109a284c5b06326e7b2e261094"} Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.895469 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.901961 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.915327 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942092 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942134 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578a5db9-0ce8-4eda-8c50-c779f29f817f-config\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942155 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5bx7\" (UniqueName: \"kubernetes.io/projected/6bf48c2a-e148-4c19-8fa7-d60550edaad5-kube-api-access-h5bx7\") pod \"migrator-59844c95c7-65jzk\" (UID: \"6bf48c2a-e148-4c19-8fa7-d60550edaad5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942192 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/21dc0ffe-3f39-4c16-8b98-8bb475342db9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: \"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942211 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml929\" (UniqueName: \"kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942226 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942242 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-srv-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942260 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " 
pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942283 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942314 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77f50c9f-6757-4d97-afae-152cd032f789-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942331 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942346 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942361 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-etcd-client\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942377 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gw9w\" (UniqueName: \"kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942414 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlljj\" (UniqueName: \"kubernetes.io/projected/5918ea44-c357-430a-90be-72cc7da6348f-kube-api-access-wlljj\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942433 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942452 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942469 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942493 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942509 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a5db9-0ce8-4eda-8c50-c779f29f817f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942533 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pl97\" (UniqueName: \"kubernetes.io/projected/21dc0ffe-3f39-4c16-8b98-8bb475342db9-kube-api-access-4pl97\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: \"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942550 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfm4\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942566 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjm8k\" (UniqueName: \"kubernetes.io/projected/9274cc77-f1fa-4169-8bb6-9ba783c69440-kube-api-access-jjm8k\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942601 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqkh\" (UniqueName: \"kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh\") pod \"console-f9d7485db-bvg6n\" (UID: 
\"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942664 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578a5db9-0ce8-4eda-8c50-c779f29f817f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942693 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942709 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.942743 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjzn\" (UniqueName: \"kubernetes.io/projected/adb37ef9-7f35-4580-929d-0883cc3ca91a-kube-api-access-rfjzn\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945084 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945557 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945576 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945710 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945758 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.945793 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.946545 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-serving-cert\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.946634 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.946966 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-srv-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:33 crc kubenswrapper[5036]: E0110 16:28:33.947718 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.447702512 +0000 UTC m=+36.317938006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.947945 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-config\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.948105 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f50c9f-6757-4d97-afae-152cd032f789-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.948129 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-service-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.948361 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f50c9f-6757-4d97-afae-152cd032f789-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.949045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.950547 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.961777 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.971241 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:33 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:33 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:33 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.971319 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.972104 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:28:33 crc kubenswrapper[5036]: I0110 16:28:33.983573 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.053956 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.054214 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.054607 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.554567673 +0000 UTC m=+36.424803167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.055057 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.063792 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-mountpoint-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.063933 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.063991 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e8b604d-42ac-4972-affe-5abed1bf54d5-metrics-tls\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064011 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064032 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-images\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064067 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a5db9-0ce8-4eda-8c50-c779f29f817f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064083 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pzplg\" (UniqueName: \"kubernetes.io/projected/eea6717e-7414-40f3-80c9-92838e761eba-kube-api-access-pzplg\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064103 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064135 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pl97\" (UniqueName: \"kubernetes.io/projected/21dc0ffe-3f39-4c16-8b98-8bb475342db9-kube-api-access-4pl97\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: \"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064160 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-socket-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064178 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-plugins-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064195 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba934a-3f28-4814-988e-f75e79084c14-serving-cert\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064225 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drfm4\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064243 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-config\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064270 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4mkt\" (UniqueName: 
\"kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064294 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064315 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjm8k\" (UniqueName: \"kubernetes.io/projected/9274cc77-f1fa-4169-8bb6-9ba783c69440-kube-api-access-jjm8k\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064335 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqkh\" (UniqueName: \"kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064367 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e8b604d-42ac-4972-affe-5abed1bf54d5-config-volume\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064399 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-trusted-ca\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064425 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578a5db9-0ce8-4eda-8c50-c779f29f817f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064443 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6g5g\" (UniqueName: \"kubernetes.io/projected/900efb57-f66f-4bd4-a99c-542aebd6b412-kube-api-access-f6g5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064474 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca\") pod 
\"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064495 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064510 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b6c89-3a6e-4b30-9363-47232ae52829-serving-cert\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064558 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjzn\" (UniqueName: \"kubernetes.io/projected/adb37ef9-7f35-4580-929d-0883cc3ca91a-kube-api-access-rfjzn\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064591 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c92qz\" (UniqueName: \"kubernetes.io/projected/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-kube-api-access-c92qz\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064625 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064658 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900efb57-f66f-4bd4-a99c-542aebd6b412-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064703 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-node-bootstrap-token\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064729 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064748 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064769 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064791 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-csi-data-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064813 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064833 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-serving-cert\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064851 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064871 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddfb\" (UniqueName: \"kubernetes.io/projected/d17b6c89-3a6e-4b30-9363-47232ae52829-kube-api-access-kddfb\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.064986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065013 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jlxfh\" (UniqueName: \"kubernetes.io/projected/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-kube-api-access-jlxfh\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065045 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-srv-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065073 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba934a-3f28-4814-988e-f75e79084c14-config\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065092 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbx42\" (UniqueName: \"kubernetes.io/projected/0e8b604d-42ac-4972-affe-5abed1bf54d5-kube-api-access-pbx42\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065114 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-config\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065140 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f50c9f-6757-4d97-afae-152cd032f789-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065156 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-service-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065173 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-certs\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065199 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f50c9f-6757-4d97-afae-152cd032f789-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065215 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065240 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v26zj\" (UniqueName: \"kubernetes.io/projected/22345551-25b1-48ef-8bfb-c4b4c10170fd-kube-api-access-v26zj\") pod \"downloads-7954f5f757-km6m5\" (UID: \"22345551-25b1-48ef-8bfb-c4b4c10170fd\") " pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065258 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-registration-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065320 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065335 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eea6717e-7414-40f3-80c9-92838e761eba-proxy-tls\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065356 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bx7\" (UniqueName: \"kubernetes.io/projected/6bf48c2a-e148-4c19-8fa7-d60550edaad5-kube-api-access-h5bx7\") pod \"migrator-59844c95c7-65jzk\" (UID: \"6bf48c2a-e148-4c19-8fa7-d60550edaad5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065372 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578a5db9-0ce8-4eda-8c50-c779f29f817f-config\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065388 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065408 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/21dc0ffe-3f39-4c16-8b98-8bb475342db9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: \"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065426 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml929\" (UniqueName: \"kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065441 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxzjh\" (UniqueName: \"kubernetes.io/projected/79a5550c-f653-48bd-9199-53843401d87e-kube-api-access-pxzjh\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065462 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq9qb\" (UniqueName: \"kubernetes.io/projected/34ba934a-3f28-4814-988e-f75e79084c14-kube-api-access-wq9qb\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.065483 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900efb57-f66f-4bd4-a99c-542aebd6b412-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.068897 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.068972 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-srv-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.068998 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069028 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069078 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069128 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-cert\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069155 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77f50c9f-6757-4d97-afae-152cd032f789-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069199 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069220 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069257 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-etcd-client\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069277 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gw9w\" (UniqueName: \"kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069329 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.069346 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlljj\" (UniqueName: \"kubernetes.io/projected/5918ea44-c357-430a-90be-72cc7da6348f-kube-api-access-wlljj\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.073066 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.073440 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.074200 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.074606 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.079328 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-service-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.079751 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.579731014 +0000 UTC m=+36.449966508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.080060 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-etcd-ca\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.080875 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f50c9f-6757-4d97-afae-152cd032f789-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.081542 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.082897 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.083571 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.090710 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/578a5db9-0ce8-4eda-8c50-c779f29f817f-config\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.091077 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5918ea44-c357-430a-90be-72cc7da6348f-config\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.092910 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.094033 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.094498 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.096171 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.099296 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.112257 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-45j5v"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.113411 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8vvs"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.114015 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.114608 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-profile-collector-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.117461 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-profile-collector-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.117911 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-serving-cert\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.117926 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.118318 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.118405 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9274cc77-f1fa-4169-8bb6-9ba783c69440-srv-cert\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.119852 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5918ea44-c357-430a-90be-72cc7da6348f-etcd-client\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.120526 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/adb37ef9-7f35-4580-929d-0883cc3ca91a-srv-cert\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.120702 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f50c9f-6757-4d97-afae-152cd032f789-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.121159 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578a5db9-0ce8-4eda-8c50-c779f29f817f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.123194 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/21dc0ffe-3f39-4c16-8b98-8bb475342db9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: 
\"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.124987 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlljj\" (UniqueName: \"kubernetes.io/projected/5918ea44-c357-430a-90be-72cc7da6348f-kube-api-access-wlljj\") pod \"etcd-operator-b45778765-pc2wp\" (UID: \"5918ea44-c357-430a-90be-72cc7da6348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: W0110 16:28:34.128339 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ea287e_5a20_4798_8f4b_4f2d0e5b1581.slice/crio-dfd8ebc1df64a9bb06bb11143df10f701fe3e2621e0c012af0da224fe04c2a1f WatchSource:0}: Error finding container dfd8ebc1df64a9bb06bb11143df10f701fe3e2621e0c012af0da224fe04c2a1f: Status 404 returned error can't find the container with id dfd8ebc1df64a9bb06bb11143df10f701fe3e2621e0c012af0da224fe04c2a1f Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.154840 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gw9w\" (UniqueName: \"kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w\") pod \"marketplace-operator-79b997595-8k59b\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.155425 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.158749 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v44cl"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.171156 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5bx7\" (UniqueName: \"kubernetes.io/projected/6bf48c2a-e148-4c19-8fa7-d60550edaad5-kube-api-access-h5bx7\") pod \"migrator-59844c95c7-65jzk\" (UID: \"6bf48c2a-e148-4c19-8fa7-d60550edaad5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.173199 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.177030 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178301 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178549 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v26zj\" (UniqueName: \"kubernetes.io/projected/22345551-25b1-48ef-8bfb-c4b4c10170fd-kube-api-access-v26zj\") pod \"downloads-7954f5f757-km6m5\" (UID: \"22345551-25b1-48ef-8bfb-c4b4c10170fd\") " pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178589 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-registration-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178637 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eea6717e-7414-40f3-80c9-92838e761eba-proxy-tls\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178661 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178723 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxzjh\" (UniqueName: \"kubernetes.io/projected/79a5550c-f653-48bd-9199-53843401d87e-kube-api-access-pxzjh\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178748 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq9qb\" (UniqueName: \"kubernetes.io/projected/34ba934a-3f28-4814-988e-f75e79084c14-kube-api-access-wq9qb\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178775 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900efb57-f66f-4bd4-a99c-542aebd6b412-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178805 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178834 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-cert\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178876 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-mountpoint-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178901 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e8b604d-42ac-4972-affe-5abed1bf54d5-metrics-tls\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.178923 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-images\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.178985 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.678955679 +0000 UTC m=+36.549191173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.179040 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.179070 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-mountpoint-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.179389 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.179507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-registration-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.179940 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180218 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzplg\" (UniqueName: \"kubernetes.io/projected/eea6717e-7414-40f3-80c9-92838e761eba-kube-api-access-pzplg\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180283 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-socket-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180316 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-plugins-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: 
\"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180478 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba934a-3f28-4814-988e-f75e79084c14-serving-cert\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180523 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-config\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180561 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4mkt\" (UniqueName: \"kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180582 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180617 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e8b604d-42ac-4972-affe-5abed1bf54d5-config-volume\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180645 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-trusted-ca\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180670 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6g5g\" (UniqueName: \"kubernetes.io/projected/900efb57-f66f-4bd4-a99c-542aebd6b412-kube-api-access-f6g5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180725 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b6c89-3a6e-4b30-9363-47232ae52829-serving-cert\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180766 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c92qz\" (UniqueName: \"kubernetes.io/projected/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-kube-api-access-c92qz\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180784 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900efb57-f66f-4bd4-a99c-542aebd6b412-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180804 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-node-bootstrap-token\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180830 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-csi-data-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180850 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kddfb\" (UniqueName: \"kubernetes.io/projected/d17b6c89-3a6e-4b30-9363-47232ae52829-kube-api-access-kddfb\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180874 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180891 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxfh\" (UniqueName: \"kubernetes.io/projected/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-kube-api-access-jlxfh\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180912 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba934a-3f28-4814-988e-f75e79084c14-config\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180932 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbx42\" (UniqueName: 
\"kubernetes.io/projected/0e8b604d-42ac-4972-affe-5abed1bf54d5-kube-api-access-pbx42\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.180953 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-certs\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.185151 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-csi-data-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.186386 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-images\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.186589 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900efb57-f66f-4bd4-a99c-542aebd6b412-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.186898 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.187032 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-socket-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.187068 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/79a5550c-f653-48bd-9199-53843401d87e-plugins-dir\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.187061 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ba934a-3f28-4814-988e-f75e79084c14-config\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.187467 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eea6717e-7414-40f3-80c9-92838e761eba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.187942 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e8b604d-42ac-4972-affe-5abed1bf54d5-config-volume\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.189001 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.68897958 +0000 UTC m=+36.559215074 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.189308 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-trusted-ca\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.190490 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d17b6c89-3a6e-4b30-9363-47232ae52829-config\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.197430 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.212669 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-certs\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.212764 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-cert\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.214178 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e8b604d-42ac-4972-affe-5abed1bf54d5-metrics-tls\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " 
pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.214817 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900efb57-f66f-4bd4-a99c-542aebd6b412-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.218100 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34ba934a-3f28-4814-988e-f75e79084c14-serving-cert\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.218313 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77f50c9f-6757-4d97-afae-152cd032f789-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-zs55l\" (UID: \"77f50c9f-6757-4d97-afae-152cd032f789\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.218699 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eea6717e-7414-40f3-80c9-92838e761eba-proxy-tls\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.220968 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-node-bootstrap-token\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.221454 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjm8k\" (UniqueName: \"kubernetes.io/projected/9274cc77-f1fa-4169-8bb6-9ba783c69440-kube-api-access-jjm8k\") pod \"olm-operator-6b444d44fb-g9f5x\" (UID: \"9274cc77-f1fa-4169-8bb6-9ba783c69440\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.222347 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d17b6c89-3a6e-4b30-9363-47232ae52829-serving-cert\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.223259 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.234006 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqkh\" (UniqueName: \"kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh\") pod \"console-f9d7485db-bvg6n\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.267185 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.270081 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjzn\" (UniqueName: \"kubernetes.io/projected/adb37ef9-7f35-4580-929d-0883cc3ca91a-kube-api-access-rfjzn\") pod \"catalog-operator-68c6474976-ckc7q\" (UID: \"adb37ef9-7f35-4580-929d-0883cc3ca91a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.270109 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml929\" (UniqueName: \"kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929\") pod \"collect-profiles-29467695-kv4q7\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.273321 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.281974 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.282517 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.78249808 +0000 UTC m=+36.652733574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.291529 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.300165 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pl97\" (UniqueName: \"kubernetes.io/projected/21dc0ffe-3f39-4c16-8b98-8bb475342db9-kube-api-access-4pl97\") pod \"package-server-manager-789f6589d5-lr2qm\" (UID: \"21dc0ffe-3f39-4c16-8b98-8bb475342db9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.316208 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/578a5db9-0ce8-4eda-8c50-c779f29f817f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2wm7\" (UID: \"578a5db9-0ce8-4eda-8c50-c779f29f817f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.327322 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.328319 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfm4\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.351187 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.382991 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v26zj\" (UniqueName: \"kubernetes.io/projected/22345551-25b1-48ef-8bfb-c4b4c10170fd-kube-api-access-v26zj\") pod \"downloads-7954f5f757-km6m5\" (UID: \"22345551-25b1-48ef-8bfb-c4b4c10170fd\") " pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.388178 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.388781 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:34.888760945 +0000 UTC m=+36.758996429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.401944 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxzjh\" (UniqueName: \"kubernetes.io/projected/79a5550c-f653-48bd-9199-53843401d87e-kube-api-access-pxzjh\") pod \"csi-hostpathplugin-lc5jj\" (UID: \"79a5550c-f653-48bd-9199-53843401d87e\") " pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.421961 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq9qb\" (UniqueName: \"kubernetes.io/projected/34ba934a-3f28-4814-988e-f75e79084c14-kube-api-access-wq9qb\") pod \"service-ca-operator-777779d784-2kjbd\" (UID: \"34ba934a-3f28-4814-988e-f75e79084c14\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.445750 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.447032 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6g5g\" (UniqueName: \"kubernetes.io/projected/900efb57-f66f-4bd4-a99c-542aebd6b412-kube-api-access-f6g5g\") pod \"kube-storage-version-migrator-operator-b67b599dd-bcpmt\" (UID: \"900efb57-f66f-4bd4-a99c-542aebd6b412\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.464736 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxfh\" (UniqueName: \"kubernetes.io/projected/1b66cca7-7c58-4f1f-9f81-63c9ff9825ca-kube-api-access-jlxfh\") pod \"machine-config-server-c8nxz\" (UID: \"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca\") " pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.481824 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzplg\" (UniqueName: \"kubernetes.io/projected/eea6717e-7414-40f3-80c9-92838e761eba-kube-api-access-pzplg\") pod \"machine-config-operator-74547568cd-jj9tv\" (UID: \"eea6717e-7414-40f3-80c9-92838e761eba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.499153 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.499711 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-10 16:28:34.999656995 +0000 UTC m=+36.869892489 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.504965 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.533252 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4mkt\" (UniqueName: \"kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt\") pod \"cni-sysctl-allowlist-ds-lt5rc\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.535119 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.540176 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddfb\" (UniqueName: \"kubernetes.io/projected/d17b6c89-3a6e-4b30-9363-47232ae52829-kube-api-access-kddfb\") pod \"console-operator-58897d9998-5hjq7\" (UID: \"d17b6c89-3a6e-4b30-9363-47232ae52829\") " pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.546835 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.575462 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c92qz\" (UniqueName: \"kubernetes.io/projected/6f0c5524-3a8c-4524-adec-7c1b61c6feaa-kube-api-access-c92qz\") pod \"ingress-canary-4qfgz\" (UID: \"6f0c5524-3a8c-4524-adec-7c1b61c6feaa\") " pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.586062 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbx42\" (UniqueName: \"kubernetes.io/projected/0e8b604d-42ac-4972-affe-5abed1bf54d5-kube-api-access-pbx42\") pod \"dns-default-w6mpm\" (UID: \"0e8b604d-42ac-4972-affe-5abed1bf54d5\") " pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.600847 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.601660 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-10 16:28:35.101645534 +0000 UTC m=+36.971881028 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.628856 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-dlncf"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.655955 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.671151 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.678182 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.685957 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.691240 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.700720 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.718661 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.719196 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.219178124 +0000 UTC m=+37.089413618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.752184 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-c8nxz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.757054 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.776540 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.776846 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4qfgz" Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.822664 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.823464 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.323451545 +0000 UTC m=+37.193687039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.924951 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.925126 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.425098255 +0000 UTC m=+37.295333749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.925345 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:34 crc kubenswrapper[5036]: E0110 16:28:34.925909 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.425888237 +0000 UTC m=+37.296123731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.957030 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" event={"ID":"05b7892c-5976-4209-821e-be876e2d43a1","Type":"ContainerStarted","Data":"8f7cc74114063858fa8c63f9df550774a99561f3e9adedde9f339a7c6b5554a1"} Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.983834 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l"] Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.984433 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" event={"ID":"374ac022-3179-463e-a9b9-6c9890a8baea","Type":"ContainerStarted","Data":"54b4067b93950aa2953d0092af3313798359c54b6c79d6bddd1a396d876e4145"} Jan 10 16:28:34 crc kubenswrapper[5036]: I0110 16:28:34.999504 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.003836 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" event={"ID":"253187aa-7581-4eb5-ab49-bc4d53a47810","Type":"ContainerStarted","Data":"73800f2f9bee742c2dd73a44b0c40fdce1591c2a42396a01fdb452cc5b0a62aa"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.006453 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" event={"ID":"76a88401-7e1f-4e2d-accb-184ff7867211","Type":"ContainerStarted","Data":"cc9bd8587671af76282bdb74ec351bd9e8e19bbb184923fa16baaf6a5ca5fc93"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.027015 5036 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.027888 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.527851224 +0000 UTC m=+37.398086718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.028089 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" event={"ID":"77b7a0ba-113f-4f0a-a6c5-f5850de92916","Type":"ContainerStarted","Data":"d8728dead7277f1e808f6bf1862b60ef5cf26815740d73cd2b325a70dfdc1e06"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.028358 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.032347 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.532330156 +0000 UTC m=+37.402565650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.052303 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" event={"ID":"87b4bb91-70e1-44be-83a9-7b6adced3e51","Type":"ContainerStarted","Data":"260a57a9a7d43bed66a5e9c1cff25df5fc91c89a717e588e11e7814384539272"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.059177 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6f310bc62d02a5e6632fc943bbe90de18251b869a6397433d6a427e9ad88c1ec"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.076590 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" event={"ID":"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581","Type":"ContainerStarted","Data":"f2e43ef035148b716424bf58d83371843de23f90899e03d9effeaa90ce4e3c37"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.076663 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" event={"ID":"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581","Type":"ContainerStarted","Data":"dfd8ebc1df64a9bb06bb11143df10f701fe3e2621e0c012af0da224fe04c2a1f"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.076968 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.079320 5036 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-b4wwd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.079381 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.084994 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" event={"ID":"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9","Type":"ContainerStarted","Data":"0c97b482a0efcf442d5731b0e60ee19b02d7a11192dfb1d6eaa0144b81f0f7d9"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.085045 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" event={"ID":"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9","Type":"ContainerStarted","Data":"9d3070c960eb99d6982dc55ac7f2398826353399f9dbd9a7b89fef4aae3b8453"} Jan 10 16:28:35 crc 
kubenswrapper[5036]: I0110 16:28:35.089076 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:35 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:35 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:35 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.089118 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.136169 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.136561 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.636536025 +0000 UTC m=+37.506771509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.136902 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.138994 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.638968821 +0000 UTC m=+37.509204505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.144590 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" event={"ID":"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48","Type":"ContainerStarted","Data":"36a0dfe6d914d608932e691dcb6fee066f8c3fa65b015ea749407817d2edf28a"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.148294 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" event={"ID":"ef797e07-14de-4b71-af82-bd8304e658dc","Type":"ContainerStarted","Data":"cef40a4e41285917c8f2a9447e7378f5d33cd87db66287a35b6e1b08aa4e56b7"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.166602 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" event={"ID":"e847b6c6-710f-4a76-9887-bac022f8de18","Type":"ContainerStarted","Data":"e2862ae75f57e9acecccd163f94b42cc564398b4c4ebeb8610a17bb60ffa980e"} Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.239022 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.239251 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.739219533 +0000 UTC m=+37.609455027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.240729 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.243535 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.743510529 +0000 UTC m=+37.613746053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.342878 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.344057 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.844032179 +0000 UTC m=+37.714267673 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.449608 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.450039 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:35.950023376 +0000 UTC m=+37.820258880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.537835 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7lh8w"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.550240 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.550781 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.050759002 +0000 UTC m=+37.920994496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.573921 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-z6hnf"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.614891 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.649845 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.649918 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt"] Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.652133 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.652667 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.152644648 +0000 UTC m=+38.022880142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.755207 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.755477 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.255416928 +0000 UTC m=+38.125652422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.755542 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.756049 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.256032585 +0000 UTC m=+38.126268139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.858085 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.858616 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.35858982 +0000 UTC m=+38.228825314 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: I0110 16:28:35.961835 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:35 crc kubenswrapper[5036]: E0110 16:28:35.962574 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.462561163 +0000 UTC m=+38.332796657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:35 crc kubenswrapper[5036]: W0110 16:28:35.980810 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb54813_4c4d_4b94_9337_19541ac1980e.slice/crio-5654262ac197457986286514237a948849b176dcf9a6303a86b0ceaffac8acb4 WatchSource:0}: Error finding container 5654262ac197457986286514237a948849b176dcf9a6303a86b0ceaffac8acb4: Status 404 returned error can't find the container with id 5654262ac197457986286514237a948849b176dcf9a6303a86b0ceaffac8acb4 Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.064480 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.064748 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.065451 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.565414105 +0000 UTC m=+38.435649599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.084441 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2rvph" podStartSLOduration=18.084415519 podStartE2EDuration="18.084415519s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.082103457 +0000 UTC m=+37.952338951" watchObservedRunningTime="2026-01-10 16:28:36.084415519 +0000 UTC m=+37.954651013" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.122439 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4ede2a2-1cff-4d29-8b81-16de7162b5fe-metrics-certs\") pod \"network-metrics-daemon-lzkzv\" (UID: \"b4ede2a2-1cff-4d29-8b81-16de7162b5fe\") " pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.130669 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" podStartSLOduration=18.1306412 podStartE2EDuration="18.1306412s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.130613379 +0000 UTC m=+38.000848873" watchObservedRunningTime="2026-01-10 16:28:36.1306412 +0000 UTC m=+38.000876694" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.133175 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-lx6q9" podStartSLOduration=18.133165658 podStartE2EDuration="18.133165658s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.110982288 +0000 UTC m=+37.981217782" watchObservedRunningTime="2026-01-10 16:28:36.133165658 +0000 UTC m=+38.003401152" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.166051 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.166401 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.666389337 +0000 UTC m=+38.536624831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.171988 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" podStartSLOduration=18.171856205 podStartE2EDuration="18.171856205s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.169212724 +0000 UTC m=+38.039448218" watchObservedRunningTime="2026-01-10 16:28:36.171856205 +0000 UTC m=+38.042091699" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.199159 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" event={"ID":"51291205-9eaf-455b-aa3b-a261761c8c06","Type":"ContainerStarted","Data":"de17056ffa8622f6efbf1397941c57447f6cc5514af4d5ebec980a5a06a2dd04"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.201253 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" event={"ID":"515d0795-5463-4e54-b0d2-ee5b16994fa4","Type":"ContainerStarted","Data":"abe6db3a8b341f80dc75b83e3e3c220cbc47e91c9915b85fc31ab1ca1ae16cbe"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.206437 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5f0a03635531a72ecf0717be3a08ea0d63f5ad4eb5e5ebc2f93e49ee225edfdc"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.207134 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kcb5k" podStartSLOduration=18.207116219 podStartE2EDuration="18.207116219s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.207023157 +0000 UTC m=+38.077258651" watchObservedRunningTime="2026-01-10 16:28:36.207116219 +0000 UTC m=+38.077351713" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.212949 5036 generic.go:334] "Generic (PLEG): container finished" podID="76a88401-7e1f-4e2d-accb-184ff7867211" containerID="5b6722c23fda3f333069927e1f17c739449b713ae9fe8b14df06b401e02387f6" exitCode=0 Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.213006 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" event={"ID":"76a88401-7e1f-4e2d-accb-184ff7867211","Type":"ContainerDied","Data":"5b6722c23fda3f333069927e1f17c739449b713ae9fe8b14df06b401e02387f6"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.224780 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zkxb5" 
event={"ID":"f6db2aeb-98ec-4b01-83b0-a0dc2816bf48","Type":"ContainerStarted","Data":"9b0feda51de8808e1a59bf1ddcf8a37330a590f5070d3be7caaa98515858fcb1"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.230309 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" event={"ID":"ef797e07-14de-4b71-af82-bd8304e658dc","Type":"ContainerStarted","Data":"9e068d4fad9115de72642762d64b46a1d3926b6d42da1dd0dfc464ea3e104530"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.232454 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-c8nxz" event={"ID":"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca","Type":"ContainerStarted","Data":"bb6eaff43beeffd6c329a5ecb009dcbf5e8dbb736af66bae859ffe52dfded161"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.233995 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" event={"ID":"56edcbe7-428c-4373-928d-b2fdf97a0a3a","Type":"ContainerStarted","Data":"deb8a8d82bea9e19965b37e9d81eb86e91935c109a6478df9ca8bff7ff21cd6d"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.235239 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" event={"ID":"acb54813-4c4d-4b94-9337-19541ac1980e","Type":"ContainerStarted","Data":"5654262ac197457986286514237a948849b176dcf9a6303a86b0ceaffac8acb4"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.236832 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" event={"ID":"87b4bb91-70e1-44be-83a9-7b6adced3e51","Type":"ContainerStarted","Data":"10a523a150199988fa9c1229061811decbedf09eb3488d0a97eeb5618b2f29f3"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.237116 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.239633 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" event={"ID":"792beb3d-c532-4c80-8ab7-3024b5db8512","Type":"ContainerStarted","Data":"e84d04a2fdff42c4d5d6844eaf4e32d6562610178581a08f7812e91c8f66191a"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.239667 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" event={"ID":"792beb3d-c532-4c80-8ab7-3024b5db8512","Type":"ContainerStarted","Data":"c53490abff74cec79201ea092b68350d6df5056bbfbb61e7ff7e66b78dcde75d"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.240038 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.244974 5036 generic.go:334] "Generic (PLEG): container finished" podID="e847b6c6-710f-4a76-9887-bac022f8de18" containerID="9fa083602d57918fbd44187cb4d649c354ea906b1aea4d117f2b53ac0563b92e" exitCode=0 Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.245084 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" event={"ID":"e847b6c6-710f-4a76-9887-bac022f8de18","Type":"ContainerDied","Data":"9fa083602d57918fbd44187cb4d649c354ea906b1aea4d117f2b53ac0563b92e"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 
16:28:36.249082 5036 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9r9hf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.249145 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.253890 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" event={"ID":"374ac022-3179-463e-a9b9-6c9890a8baea","Type":"ContainerStarted","Data":"8411140d2093637bd56b83353e6e4e3b94f49f5742f7fa98b43f45733d927201"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.268222 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.269714 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.769673912 +0000 UTC m=+38.639909406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.278701 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:36 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:36 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:36 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.278763 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.293482 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lzkzv" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.311389 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3263910cd59a6c43e1920d09f7658194e8f42585b361a57575a726a8788f53d5"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.338114 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" event={"ID":"ec9ab704-1f8b-473b-bbe2-3c09d04991cd","Type":"ContainerStarted","Data":"f31d291e7892c805fc0557dbecab0e14aa82a3b2bc98acffb873651a77e82b6c"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.341529 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-dlncf" podStartSLOduration=18.341514995 podStartE2EDuration="18.341514995s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.296508228 +0000 UTC m=+38.166743722" watchObservedRunningTime="2026-01-10 16:28:36.341514995 +0000 UTC m=+38.211750489" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.376863 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.378346 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.383624 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.883595164 +0000 UTC m=+38.753830658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.444295 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.456339 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" event={"ID":"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8","Type":"ContainerStarted","Data":"b67fa2c39e6a8a73c4156f465aa9f073f1468d97b36996561564818de3f3c60c"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.488872 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.489999 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:36.989973852 +0000 UTC m=+38.860209336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.506727 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" event={"ID":"6b14e5d5-1b40-45f6-a5c6-c161eeade0f9","Type":"ContainerStarted","Data":"9eb4ebad1cafb1d421122daef0466824fd32f8a77119ed3f3eb32e2ae342e983"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.510539 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" podStartSLOduration=18.510511117 podStartE2EDuration="18.510511117s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.433197616 +0000 UTC m=+38.303433110" watchObservedRunningTime="2026-01-10 16:28:36.510511117 +0000 UTC m=+38.380746611" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.549572 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" event={"ID":"253187aa-7581-4eb5-ab49-bc4d53a47810","Type":"ContainerStarted","Data":"dbddaaf8210cf7a63136289e815b75e56cb89b6d63f96337e07ab687e7a72d3e"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.553053 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" podStartSLOduration=18.553031268 podStartE2EDuration="18.553031268s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.512021788 +0000 UTC m=+38.382257282" watchObservedRunningTime="2026-01-10 16:28:36.553031268 +0000 UTC m=+38.423266762" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.554405 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-45j5v" podStartSLOduration=18.554397735 podStartE2EDuration="18.554397735s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.552919605 +0000 UTC m=+38.423155099" watchObservedRunningTime="2026-01-10 16:28:36.554397735 +0000 UTC m=+38.424633229" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.565306 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" event={"ID":"2a27702d-fd8a-4b89-883e-a2250c0cb1a9","Type":"ContainerStarted","Data":"93850e35e3c94618dba6953dee671b5ffe50e7b07838986437d4f42413e2ce79"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.591607 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.592468 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.092454394 +0000 UTC m=+38.962689888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.596430 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" event={"ID":"09ff12a0-dcd3-465b-a051-1f0216f9ba57","Type":"ContainerStarted","Data":"27b138cbfe74041880b3ba84c3e4dd38a83111bcfbde5234eda2b5a24df6ffe8"} Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.604666 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.672791 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-s8dts" podStartSLOduration=18.672774607 podStartE2EDuration="18.672774607s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:36.670530577 +0000 UTC m=+38.540766071" watchObservedRunningTime="2026-01-10 16:28:36.672774607 +0000 UTC m=+38.543010101" Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.701437 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.702010 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.201977297 +0000 UTC m=+39.072212791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.702150 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.703550 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.203530969 +0000 UTC m=+39.073766463 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.805078 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.805610 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.30557689 +0000 UTC m=+39.175812384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.855222 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.864259 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.889279 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.889346 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pc2wp"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.892485 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.895815 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l"] Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.910496 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:36 crc kubenswrapper[5036]: E0110 16:28:36.910839 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.410828188 +0000 UTC m=+39.281063682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.914067 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm"] Jan 10 16:28:36 crc kubenswrapper[5036]: W0110 16:28:36.949125 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9274cc77_f1fa_4169_8bb6_9ba783c69440.slice/crio-4c247a3dc137804449a7e545265dc67ba973809086dc6fbddf4649c8113fe107 WatchSource:0}: Error finding container 4c247a3dc137804449a7e545265dc67ba973809086dc6fbddf4649c8113fe107: Status 404 returned error can't find the container with id 4c247a3dc137804449a7e545265dc67ba973809086dc6fbddf4649c8113fe107 Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.977964 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:36 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:36 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:36 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:36 crc kubenswrapper[5036]: I0110 16:28:36.978038 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.015358 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.015869 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.515829939 +0000 UTC m=+39.386065433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.016527 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.028976 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk"] Jan 10 16:28:37 crc kubenswrapper[5036]: W0110 16:28:37.056100 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b7d7be_9cb2_4817_89a0_ae511aa199ea.slice/crio-ca3dc0ff9bb0c2355943d4c75c1b9e2f6c8b40d30f488bea674717a453188216 WatchSource:0}: Error finding container ca3dc0ff9bb0c2355943d4c75c1b9e2f6c8b40d30f488bea674717a453188216: Status 404 returned error can't find the container with id ca3dc0ff9bb0c2355943d4c75c1b9e2f6c8b40d30f488bea674717a453188216 Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.063809 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:28:37 crc kubenswrapper[5036]: W0110 16:28:37.077341 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f50c9f_6757_4d97_afae_152cd032f789.slice/crio-175b83e2f634c75989d3f847c69566537ac6fb86f30184c6d4fb2db3f8bfa6f0 WatchSource:0}: Error finding container 175b83e2f634c75989d3f847c69566537ac6fb86f30184c6d4fb2db3f8bfa6f0: Status 404 returned error can't find the container with id 175b83e2f634c75989d3f847c69566537ac6fb86f30184c6d4fb2db3f8bfa6f0 Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.118527 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.118844 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.618832395 +0000 UTC m=+39.489067889 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.158368 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lc5jj"] Jan 10 16:28:37 crc kubenswrapper[5036]: W0110 16:28:37.158425 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0bc40ca_fd61_4885_871b_3a7964df225a.slice/crio-bb28514038fe02be6d06577a766557dc53bd20904fe20e2f017c1728a586a8b4 WatchSource:0}: Error finding container bb28514038fe02be6d06577a766557dc53bd20904fe20e2f017c1728a586a8b4: Status 404 returned error can't find the container with id bb28514038fe02be6d06577a766557dc53bd20904fe20e2f017c1728a586a8b4 Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.201137 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.224534 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.224964 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.724943796 +0000 UTC m=+39.595179290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.239257 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w6mpm"] Jan 10 16:28:37 crc kubenswrapper[5036]: W0110 16:28:37.265673 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79a5550c_f653_48bd_9199_53843401d87e.slice/crio-465e31dd278a4f03079056698a67d7fc78c0a3bfd71392b5aad12dd230b37bb8 WatchSource:0}: Error finding container 465e31dd278a4f03079056698a67d7fc78c0a3bfd71392b5aad12dd230b37bb8: Status 404 returned error can't find the container with id 465e31dd278a4f03079056698a67d7fc78c0a3bfd71392b5aad12dd230b37bb8 Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.293804 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.293865 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-5hjq7"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.332291 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.332786 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.832763493 +0000 UTC m=+39.702998987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.349569 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.355448 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lzkzv"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.380210 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.401248 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4qfgz"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.404340 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-km6m5"] Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.436889 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.437020 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.936983433 +0000 UTC m=+39.807218927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.437289 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.441465 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:37.937665941 +0000 UTC m=+39.807901435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.514109 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.514325 5036 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.543108 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.543536 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.043514105 +0000 UTC m=+39.913749599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.558060 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.645832 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.646853 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.14682582 +0000 UTC m=+40.017061314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.661841 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" event={"ID":"d17b6c89-3a6e-4b30-9363-47232ae52829","Type":"ContainerStarted","Data":"4b638fff8bc3663b2ba7e9d0877f0c7953ad883bcab14108029b356cb8ca5dc5"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.684609 5036 generic.go:334] "Generic (PLEG): container finished" podID="ec9ab704-1f8b-473b-bbe2-3c09d04991cd" containerID="6117b9537f77d8b02b8b45d0c485628bceee2e739674572e24cc272083476112" exitCode=0 Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.685405 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" event={"ID":"ec9ab704-1f8b-473b-bbe2-3c09d04991cd","Type":"ContainerDied","Data":"6117b9537f77d8b02b8b45d0c485628bceee2e739674572e24cc272083476112"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.699737 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" event={"ID":"578a5db9-0ce8-4eda-8c50-c779f29f817f","Type":"ContainerStarted","Data":"7eb08ebae97aa0de9a674c9c82a6e38660571fab47d2c9ef8e8854664166d5bd"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.708361 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" event={"ID":"515d0795-5463-4e54-b0d2-ee5b16994fa4","Type":"ContainerStarted","Data":"ac21d93a4ef0f6d58644c61fa1269711bd558d82892b1ecd622147a46f286b38"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.739010 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzkzv" event={"ID":"b4ede2a2-1cff-4d29-8b81-16de7162b5fe","Type":"ContainerStarted","Data":"c2653a0dfda60082a69fff64fc94729b2ae383e0b05eab055adcebf4c5207660"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.741471 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6mpm" event={"ID":"0e8b604d-42ac-4972-affe-5abed1bf54d5","Type":"ContainerStarted","Data":"b897dd45d06d7e0eaae3372b6ff2f9434bd7bd5fbdb65514a41ca703ca97cd6b"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.748438 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.748961 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.248933313 +0000 UTC m=+40.119168807 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.753209 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerStarted","Data":"bb28514038fe02be6d06577a766557dc53bd20904fe20e2f017c1728a586a8b4"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.766764 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" podStartSLOduration=19.766737024 podStartE2EDuration="19.766737024s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:37.765076609 +0000 UTC m=+39.635312103" watchObservedRunningTime="2026-01-10 16:28:37.766737024 +0000 UTC m=+39.636972518" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.780521 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" event={"ID":"9274cc77-f1fa-4169-8bb6-9ba783c69440","Type":"ContainerStarted","Data":"4c247a3dc137804449a7e545265dc67ba973809086dc6fbddf4649c8113fe107"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.783158 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" event={"ID":"77f50c9f-6757-4d97-afae-152cd032f789","Type":"ContainerStarted","Data":"175b83e2f634c75989d3f847c69566537ac6fb86f30184c6d4fb2db3f8bfa6f0"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.793155 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"de214d5b3c037f4bbd3b198756036c30cbaba0f51ddd4d5805f4133012b2331a"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.793919 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.798898 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" event={"ID":"88184b8f-9aed-4978-bfbb-7054dd96550e","Type":"ContainerStarted","Data":"2e632d597ae917984bd93afccc6550ffe23a46255e62348293509b587cde159f"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.798966 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" event={"ID":"88184b8f-9aed-4978-bfbb-7054dd96550e","Type":"ContainerStarted","Data":"a5ad6c0fedc032dad707304ba700bd9c20ae118487cb563b78d0a84338a8a4af"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.798991 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" 
event={"ID":"88184b8f-9aed-4978-bfbb-7054dd96550e","Type":"ContainerStarted","Data":"8ad3a9444cf7c13e6cd8777eb64ad22240d26ced3ed1f1b366c6785ba36d1aee"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.813506 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" event={"ID":"acb54813-4c4d-4b94-9337-19541ac1980e","Type":"ContainerStarted","Data":"65540a959d6fec9e2329f74acda232ccffc0753b6def9fc188ac178f6fb2f00d"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.816058 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-km6m5" event={"ID":"22345551-25b1-48ef-8bfb-c4b4c10170fd","Type":"ContainerStarted","Data":"22deca2636864c5559be37e52968c5836ce37efda9c012d6c4433eaf520a716d"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.822881 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" event={"ID":"79a5550c-f653-48bd-9199-53843401d87e","Type":"ContainerStarted","Data":"465e31dd278a4f03079056698a67d7fc78c0a3bfd71392b5aad12dd230b37bb8"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.839886 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" event={"ID":"56edcbe7-428c-4373-928d-b2fdf97a0a3a","Type":"ContainerStarted","Data":"a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.840716 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.850123 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.850868 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.35085316 +0000 UTC m=+40.221088654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.883015 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7205e60df6fbc9bb6c955c0eaed3cc05b35de0e7a51ae01f3b626ca862c0ffb0"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.883079 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"df8f17a4d6022f3961efa23c51874e90af13c2e68b923c45c49ee823288cfcd0"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.887106 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-c8nxz" event={"ID":"1b66cca7-7c58-4f1f-9f81-63c9ff9825ca","Type":"ContainerStarted","Data":"692c016fa0359137f7b2b3f56f058b3d06d6fdfb5cbf71fb645252c8dc6b16e8"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.894118 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" event={"ID":"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4","Type":"ContainerStarted","Data":"e30d4a6522341d415e7b16900762a8b1b702e55abd5aafe38dc1153573fe767a"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.908902 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptztt" podStartSLOduration=19.9088867 podStartE2EDuration="19.9088867s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:37.907345398 +0000 UTC m=+39.777580882" watchObservedRunningTime="2026-01-10 16:28:37.9088867 +0000 UTC m=+39.779122194" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.912619 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" event={"ID":"900efb57-f66f-4bd4-a99c-542aebd6b412","Type":"ContainerStarted","Data":"ab7061acfa3a98e8b57c0a8fdb8fd2302d2b362feb87d36d54ba25ec30da447a"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.914757 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" event={"ID":"2a27702d-fd8a-4b89-883e-a2250c0cb1a9","Type":"ContainerStarted","Data":"86b8d06f294413a0f3f6f561b5b0609d09b1346d06dc2fe9b560e439dad38e4a"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.919890 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" event={"ID":"374ac022-3179-463e-a9b9-6c9890a8baea","Type":"ContainerStarted","Data":"1f1a2a89cff944bdb720d895f7ead9eb8f520ea1ba22ffbd9591d4c11730c01c"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.960566 5036 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:37 crc kubenswrapper[5036]: E0110 16:28:37.962366 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.462349767 +0000 UTC m=+40.332585261 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.975531 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.983299 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bvg6n" event={"ID":"d1559e8b-1a4d-4929-80cc-235f23048dd6","Type":"ContainerStarted","Data":"8e59b598f5cad56eec6dda20ae3a2de67421836d77c3daf83c09affa4e787dbe"} Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.988029 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:37 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:37 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:37 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:37 crc kubenswrapper[5036]: I0110 16:28:37.988141 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.034963 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" event={"ID":"76a88401-7e1f-4e2d-accb-184ff7867211","Type":"ContainerStarted","Data":"a1f0cdbd92e9519c30f9297ac6784489bb87f08c6081310a9b1a32d2770bd953"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.055396 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" event={"ID":"34ba934a-3f28-4814-988e-f75e79084c14","Type":"ContainerStarted","Data":"06fa4b8b6c23fa9482dcbdfd63cc5adf41dedd5bab3d2ae370dd059d35e5630e"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.062737 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: 
\"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.064731 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.564715676 +0000 UTC m=+40.434951170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.089381 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podStartSLOduration=7.089363253 podStartE2EDuration="7.089363253s" podCreationTimestamp="2026-01-10 16:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.086041343 +0000 UTC m=+39.956276827" watchObservedRunningTime="2026-01-10 16:28:38.089363253 +0000 UTC m=+39.959598747" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.150144 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" event={"ID":"6bf48c2a-e148-4c19-8fa7-d60550edaad5","Type":"ContainerStarted","Data":"390e9a4c9e2da92b64aa774da581ac987105d8eb070708b8ff93ed91cbd4b388"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.167889 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8vgqk" podStartSLOduration=20.167868287 podStartE2EDuration="20.167868287s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.166482079 +0000 UTC m=+40.036717573" watchObservedRunningTime="2026-01-10 16:28:38.167868287 +0000 UTC m=+40.038103781" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.169384 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.170484 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.670442217 +0000 UTC m=+40.540677711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.182066 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" event={"ID":"615043bb-4f5a-497c-9d23-4ef7fe1b7ac8","Type":"ContainerStarted","Data":"afd1c497df78c9825762d5a16bc35affcb5cc58a6068cb222c76017ed92f1543"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.227334 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" event={"ID":"e847b6c6-710f-4a76-9887-bac022f8de18","Type":"ContainerStarted","Data":"97d39afda84960dceba58b6464a5e95deb3fe72954fd03d76a4ae89657313fb7"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.228134 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.256052 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" event={"ID":"21dc0ffe-3f39-4c16-8b98-8bb475342db9","Type":"ContainerStarted","Data":"5e9192d06c247ba5b56943f10dcde8ef956cae0ce96bab23032c71b0129eba2c"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.273793 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.275487 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.775472108 +0000 UTC m=+40.645707592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.285831 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-v44cl" podStartSLOduration=20.285803598 podStartE2EDuration="20.285803598s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.276514296 +0000 UTC m=+40.146749790" watchObservedRunningTime="2026-01-10 16:28:38.285803598 +0000 UTC m=+40.156039092" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.293097 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" event={"ID":"51291205-9eaf-455b-aa3b-a261761c8c06","Type":"ContainerStarted","Data":"c2ba3794f75adb0645126b5d824a7c7993217ca03d8519c4571aaa8c36739320"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.293171 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" event={"ID":"51291205-9eaf-455b-aa3b-a261761c8c06","Type":"ContainerStarted","Data":"754eb01de7fad69c389a60337a19fb97167f0a436e204d7251e8678ed6d57063"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.337090 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" event={"ID":"09ff12a0-dcd3-465b-a051-1f0216f9ba57","Type":"ContainerStarted","Data":"0f71f78799f652ea18e39426f3186d5352ab601f1db9579c1f7ee3dc4753b27d"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.337159 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" event={"ID":"09ff12a0-dcd3-465b-a051-1f0216f9ba57","Type":"ContainerStarted","Data":"9bff1eac65aaab27807a63a4f417930af5973320dc16ccb838205c6478d45ef8"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.356928 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" event={"ID":"adb37ef9-7f35-4580-929d-0883cc3ca91a","Type":"ContainerStarted","Data":"5fdc22c5c8f86ce55d46c3ba3a9567ba1328dbd1afda1c4fb4b1297f7a8a2d5c"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.358112 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.377912 5036 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ckc7q container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.378008 5036 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" podUID="adb37ef9-7f35-4580-929d-0883cc3ca91a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.379189 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.380550 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.88052694 +0000 UTC m=+40.750762434 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.383443 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" event={"ID":"a1b7d7be-9cb2-4817-89a0-ae511aa199ea","Type":"ContainerStarted","Data":"ca3dc0ff9bb0c2355943d4c75c1b9e2f6c8b40d30f488bea674717a453188216"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.385049 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.403873 5036 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zdtsj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.403951 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" podUID="a1b7d7be-9cb2-4817-89a0-ae511aa199ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.409849 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" event={"ID":"5918ea44-c357-430a-90be-72cc7da6348f","Type":"ContainerStarted","Data":"ba19702fa814db628a10f2f59b6c827f95e5466b5656f2ecab01863f07ebce18"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.422234 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" event={"ID":"eea6717e-7414-40f3-80c9-92838e761eba","Type":"ContainerStarted","Data":"c73c6b136cb7243b229fe74ebab78c27f8e6f4f729df3d9a6cb1d88298632523"} Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 
16:28:38.438227 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.485418 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.490734 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:38.990716541 +0000 UTC m=+40.860952035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.508918 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" podStartSLOduration=20.508890873 podStartE2EDuration="20.508890873s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.471663876 +0000 UTC m=+40.341899370" watchObservedRunningTime="2026-01-10 16:28:38.508890873 +0000 UTC m=+40.379126367" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.552850 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-c8nxz" podStartSLOduration=7.552827192 podStartE2EDuration="7.552827192s" podCreationTimestamp="2026-01-10 16:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.551951848 +0000 UTC m=+40.422187342" watchObservedRunningTime="2026-01-10 16:28:38.552827192 +0000 UTC m=+40.423062686" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.578220 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-r5pns" podStartSLOduration=20.578198108 podStartE2EDuration="20.578198108s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.578134026 +0000 UTC m=+40.448369520" watchObservedRunningTime="2026-01-10 16:28:38.578198108 +0000 UTC m=+40.448433602" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.593995 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.597213 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.097183481 +0000 UTC m=+40.967418975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.647497 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" podStartSLOduration=20.647460161 podStartE2EDuration="20.647460161s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.647201244 +0000 UTC m=+40.517436728" watchObservedRunningTime="2026-01-10 16:28:38.647460161 +0000 UTC m=+40.517695655" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.695819 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rv7wh" podStartSLOduration=20.695785409 podStartE2EDuration="20.695785409s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.683040034 +0000 UTC m=+40.553275538" watchObservedRunningTime="2026-01-10 16:28:38.695785409 +0000 UTC m=+40.566020903" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.699986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.701366 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.201351979 +0000 UTC m=+41.071587463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.801562 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.801926 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.301900289 +0000 UTC m=+41.172135783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.802276 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.802725 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.302717171 +0000 UTC m=+41.172952665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.889431 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" podStartSLOduration=20.889400837 podStartE2EDuration="20.889400837s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.889374196 +0000 UTC m=+40.759609690" watchObservedRunningTime="2026-01-10 16:28:38.889400837 +0000 UTC m=+40.759636331" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.890788 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fnq8l" podStartSLOduration=20.890780334 podStartE2EDuration="20.890780334s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.840385251 +0000 UTC m=+40.710620735" watchObservedRunningTime="2026-01-10 16:28:38.890780334 +0000 UTC m=+40.761015828" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.903436 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:38 crc kubenswrapper[5036]: E0110 16:28:38.903978 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.403958581 +0000 UTC m=+41.274194085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.971291 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pjrh2" podStartSLOduration=20.971272072 podStartE2EDuration="20.971272072s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.923777097 +0000 UTC m=+40.794012591" watchObservedRunningTime="2026-01-10 16:28:38.971272072 +0000 UTC m=+40.841507566" Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.977578 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:38 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:38 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:38 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:38 crc kubenswrapper[5036]: I0110 16:28:38.977645 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.012386 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.012938 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.512923269 +0000 UTC m=+41.383158763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.091238 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" podStartSLOduration=21.091219697 podStartE2EDuration="21.091219697s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:38.972563977 +0000 UTC m=+40.842799471" watchObservedRunningTime="2026-01-10 16:28:39.091219697 +0000 UTC m=+40.961455181" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.120447 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.121447 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.621430214 +0000 UTC m=+41.491665708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.200857 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vv7rp" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.222481 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.222809 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.722796727 +0000 UTC m=+41.593032221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.293154 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lt5rc"] Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.327835 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.328226 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.828211709 +0000 UTC m=+41.698447193 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.429805 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.430628 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:39.930612179 +0000 UTC m=+41.800847673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.500425 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" event={"ID":"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4","Type":"ContainerStarted","Data":"27662017b13517331e40f4940e69f8313a92f2b48d53a21db24568d97f34793a"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.531943 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.533107 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.033084811 +0000 UTC m=+41.903320305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.553373 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-z6hnf" event={"ID":"515d0795-5463-4e54-b0d2-ee5b16994fa4","Type":"ContainerStarted","Data":"8b910a49ceb452f9e6b09929c764608124587c7764558d2ad8affc3e4d8e29a9"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.619521 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" event={"ID":"5918ea44-c357-430a-90be-72cc7da6348f","Type":"ContainerStarted","Data":"4a2172a10a5389959fe3688fbd9c0ff8f5446da82a0396e8a9b9e50faf5b5786"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.635832 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.637012 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.136997193 +0000 UTC m=+42.007232687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.655305 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" event={"ID":"77f50c9f-6757-4d97-afae-152cd032f789","Type":"ContainerStarted","Data":"ce6c683b081b76695db5bc2f89a485ea4360db701d67499dc70fd9381db9341a"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.681135 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" event={"ID":"a1b7d7be-9cb2-4817-89a0-ae511aa199ea","Type":"ContainerStarted","Data":"9937567524ed18e880a2c973d7ccd4cbdb8f9f47ec29bfa0adbdd23199809973"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.693142 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" event={"ID":"9274cc77-f1fa-4169-8bb6-9ba783c69440","Type":"ContainerStarted","Data":"a3715519dcc9ec53223f7565e0155b8081747887a2705fc6ad8e73be8eb8a92d"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.695588 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.737628 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.738721 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.238706165 +0000 UTC m=+42.108941659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.786590 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.790067 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" event={"ID":"34ba934a-3f28-4814-988e-f75e79084c14","Type":"ContainerStarted","Data":"d890ee6e8761c2b04775ddf8195a36aa4e3dd08398f22cd9f17649837a5c41dc"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.810616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerStarted","Data":"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.811798 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.815832 5036 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8k59b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.815884 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.829650 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" event={"ID":"578a5db9-0ce8-4eda-8c50-c779f29f817f","Type":"ContainerStarted","Data":"d65de1c7e54a7f236d747e59d771bf8c3ff8c85d94d0180dbd4733f0b3440d24"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.841377 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.842818 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.342801281 +0000 UTC m=+42.213036775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.858850 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" event={"ID":"d17b6c89-3a6e-4b30-9363-47232ae52829","Type":"ContainerStarted","Data":"1203efaec43cc4fcc06ce59f35a4fac8b908472bc036aef3a11bceb0983a2a7e"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.859717 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.863901 5036 patch_prober.go:28] interesting pod/console-operator-58897d9998-5hjq7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.863978 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" podUID="d17b6c89-3a6e-4b30-9363-47232ae52829" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.877105 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" event={"ID":"900efb57-f66f-4bd4-a99c-542aebd6b412","Type":"ContainerStarted","Data":"0f89204f7c728a0c358bb11e6b78e784246396f877eec7f38d68c1260ef710e0"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.934858 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" event={"ID":"6bf48c2a-e148-4c19-8fa7-d60550edaad5","Type":"ContainerStarted","Data":"540a62c512613b1dae65b8f51eb26a8d09c7d019c1dd310531507ece8d3dcb38"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.943055 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:39 crc kubenswrapper[5036]: E0110 16:28:39.945459 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.445440078 +0000 UTC m=+42.315675572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.972486 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bvg6n" event={"ID":"d1559e8b-1a4d-4929-80cc-235f23048dd6","Type":"ContainerStarted","Data":"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.976257 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:39 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:39 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:39 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.976299 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.992696 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" event={"ID":"21dc0ffe-3f39-4c16-8b98-8bb475342db9","Type":"ContainerStarted","Data":"10bbf5c7b04da36d30ef7729bfb2c1a0eb0df646efc91edbfbfcb6092a859159"} Jan 10 16:28:39 crc kubenswrapper[5036]: I0110 16:28:39.993554 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.004333 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4qfgz" event={"ID":"6f0c5524-3a8c-4524-adec-7c1b61c6feaa","Type":"ContainerStarted","Data":"1e150258472f149dfa9b407892291d4bbce8c88c58c8b7005d6857a64fad40f3"} Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.004390 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4qfgz" event={"ID":"6f0c5524-3a8c-4524-adec-7c1b61c6feaa","Type":"ContainerStarted","Data":"22587cd35d7fce2fe08937025235f50a3a5c6b8d5edc0d0ce7704b580f43a79a"} Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.046920 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.049258 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-10 16:28:40.549245166 +0000 UTC m=+42.419480660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.053167 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.073852 5036 patch_prober.go:28] interesting pod/downloads-7954f5f757-km6m5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.073926 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-km6m5" podUID="22345551-25b1-48ef-8bfb-c4b4c10170fd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.074359 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" event={"ID":"eea6717e-7414-40f3-80c9-92838e761eba","Type":"ContainerStarted","Data":"29f75badf3cb7c6ff78c72833c98594040368dac905474f31c2746e5b180fb2f"} Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.074412 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" event={"ID":"eea6717e-7414-40f3-80c9-92838e761eba","Type":"ContainerStarted","Data":"c3f3bc6b890f5100fd085f1359db888ea5896533c678935ec16081a2d6fc7597"} Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.115791 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" event={"ID":"adb37ef9-7f35-4580-929d-0883cc3ca91a","Type":"ContainerStarted","Data":"024c2d10670772267f2016b8786efd45d7b85b780a8ae29949704f757222ffe6"} Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.125994 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ckc7q" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.149228 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.154809 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.654785301 +0000 UTC m=+42.525020795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.228803 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bvg6n" podStartSLOduration=22.228776883 podStartE2EDuration="22.228776883s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.226329297 +0000 UTC m=+42.096564801" watchObservedRunningTime="2026-01-10 16:28:40.228776883 +0000 UTC m=+42.099012377" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.258145 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.258951 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.758934149 +0000 UTC m=+42.629169643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.300166 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4qfgz" podStartSLOduration=9.300141933999999 podStartE2EDuration="9.300141934s" podCreationTimestamp="2026-01-10 16:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.286129855 +0000 UTC m=+42.156365359" watchObservedRunningTime="2026-01-10 16:28:40.300141934 +0000 UTC m=+42.170377428" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.360310 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.360629 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-10 16:28:40.86061146 +0000 UTC m=+42.730846944 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.415959 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pc2wp" podStartSLOduration=22.415939357 podStartE2EDuration="22.415939357s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.413355867 +0000 UTC m=+42.283591361" watchObservedRunningTime="2026-01-10 16:28:40.415939357 +0000 UTC m=+42.286174851" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.424069 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-g9f5x" podStartSLOduration=22.424043716 podStartE2EDuration="22.424043716s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.363784176 +0000 UTC m=+42.234019690" watchObservedRunningTime="2026-01-10 16:28:40.424043716 +0000 UTC m=+42.294279210" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.465107 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.465480 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:40.965468257 +0000 UTC m=+42.835703751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.478763 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" podStartSLOduration=22.478744596 podStartE2EDuration="22.478744596s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.476802834 +0000 UTC m=+42.347038328" watchObservedRunningTime="2026-01-10 16:28:40.478744596 +0000 UTC m=+42.348980090" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.575256 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.575626 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.075610977 +0000 UTC m=+42.945846471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.609282 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-zs55l" podStartSLOduration=22.609252507 podStartE2EDuration="22.609252507s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.548125143 +0000 UTC m=+42.418360637" watchObservedRunningTime="2026-01-10 16:28:40.609252507 +0000 UTC m=+42.479488001" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.613412 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zdtsj" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.657394 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jj9tv" podStartSLOduration=22.657367289 podStartE2EDuration="22.657367289s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.656254669 +0000 UTC m=+42.526490163" watchObservedRunningTime="2026-01-10 16:28:40.657367289 +0000 UTC m=+42.527602783" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.658042 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" podStartSLOduration=22.658036347 podStartE2EDuration="22.658036347s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.611137978 +0000 UTC m=+42.481373472" watchObservedRunningTime="2026-01-10 16:28:40.658036347 +0000 UTC m=+42.528271831" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.678664 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.679061 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.179045765 +0000 UTC m=+43.049281259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.708875 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bcpmt" podStartSLOduration=22.708855432 podStartE2EDuration="22.708855432s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.704260157 +0000 UTC m=+42.574495651" watchObservedRunningTime="2026-01-10 16:28:40.708855432 +0000 UTC m=+42.579090926" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.741047 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-km6m5" podStartSLOduration=22.741022882 podStartE2EDuration="22.741022882s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.740063636 +0000 UTC m=+42.610299130" watchObservedRunningTime="2026-01-10 16:28:40.741022882 +0000 UTC m=+42.611258376" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.779725 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.780197 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.280178691 +0000 UTC m=+43.150414185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.812341 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" podStartSLOduration=21.812323861 podStartE2EDuration="21.812323861s" podCreationTimestamp="2026-01-10 16:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.777984482 +0000 UTC m=+42.648219976" watchObservedRunningTime="2026-01-10 16:28:40.812323861 +0000 UTC m=+42.682559355" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.840908 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-2kjbd" podStartSLOduration=22.840890754 podStartE2EDuration="22.840890754s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.839052784 +0000 UTC m=+42.709288288" watchObservedRunningTime="2026-01-10 16:28:40.840890754 +0000 UTC m=+42.711126248" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.868292 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" podStartSLOduration=22.868274845 podStartE2EDuration="22.868274845s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.864694928 +0000 UTC m=+42.734930432" watchObservedRunningTime="2026-01-10 16:28:40.868274845 +0000 UTC m=+42.738510349" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.883064 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.883483 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.383466766 +0000 UTC m=+43.253702260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.941762 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" podStartSLOduration=22.941738082 podStartE2EDuration="22.941738082s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.940577641 +0000 UTC m=+42.810813135" watchObservedRunningTime="2026-01-10 16:28:40.941738082 +0000 UTC m=+42.811973576" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.942133 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2wm7" podStartSLOduration=22.942127503000002 podStartE2EDuration="22.942127503s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:40.903092257 +0000 UTC m=+42.773327751" watchObservedRunningTime="2026-01-10 16:28:40.942127503 +0000 UTC m=+42.812362997" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.962827 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:40 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:40 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:40 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.962901 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.983920 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.984205 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.48417174 +0000 UTC m=+43.354407234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:40 crc kubenswrapper[5036]: I0110 16:28:40.984350 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:40 crc kubenswrapper[5036]: E0110 16:28:40.984749 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.484735135 +0000 UTC m=+43.354970629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.086086 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.086476 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.586457178 +0000 UTC m=+43.456692672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.103246 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.104382 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.114152 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.122282 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzkzv" event={"ID":"b4ede2a2-1cff-4d29-8b81-16de7162b5fe","Type":"ContainerStarted","Data":"08ac56ccd9b7ccdfb8c1dcc4bc56ca931bae4afaea225bd93b1b593b193a080b"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.122338 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lzkzv" event={"ID":"b4ede2a2-1cff-4d29-8b81-16de7162b5fe","Type":"ContainerStarted","Data":"8054f42d6ed413bac154a2c4d778b8591b044369a3da6689fc3ff6db1024d11c"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.127717 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-km6m5" event={"ID":"22345551-25b1-48ef-8bfb-c4b4c10170fd","Type":"ContainerStarted","Data":"0d19733e00fd6ea76d23b1074267128bfd228e76fff96fdff9d91a911389f869"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.128402 5036 patch_prober.go:28] interesting pod/downloads-7954f5f757-km6m5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.128479 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-km6m5" podUID="22345551-25b1-48ef-8bfb-c4b4c10170fd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.130982 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" event={"ID":"79a5550c-f653-48bd-9199-53843401d87e","Type":"ContainerStarted","Data":"238e436b1d45d8caad98ff2b94d5b53809c86e69ebb19120cc4bcfb92661708a"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.132725 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.133339 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6mpm" event={"ID":"0e8b604d-42ac-4972-affe-5abed1bf54d5","Type":"ContainerStarted","Data":"482b530fa6074582cf13a0bc47d129d1a0a6a72666254d17f7230ea1a816f24f"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.133401 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w6mpm" event={"ID":"0e8b604d-42ac-4972-affe-5abed1bf54d5","Type":"ContainerStarted","Data":"62ef8b9993496a631da81ead14601403c52547db193e11aae9774e57cc47c96a"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.134277 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.139105 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" event={"ID":"ec9ab704-1f8b-473b-bbe2-3c09d04991cd","Type":"ContainerStarted","Data":"8764c013636833a06f3e6ab68eba1807aabcac52ee28785f52a9548c97707a5d"} Jan 10 16:28:41 
crc kubenswrapper[5036]: I0110 16:28:41.139160 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" event={"ID":"ec9ab704-1f8b-473b-bbe2-3c09d04991cd","Type":"ContainerStarted","Data":"94d012eec28c76a66ba532dd590cb29eebf64e42551fa9398f080b86695d8452"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.143872 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-65jzk" event={"ID":"6bf48c2a-e148-4c19-8fa7-d60550edaad5","Type":"ContainerStarted","Data":"ab833ee661a22bbf1c271450e8490aa0828da1d778223105c943332422f83f9f"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.147194 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" event={"ID":"21dc0ffe-3f39-4c16-8b98-8bb475342db9","Type":"ContainerStarted","Data":"7e14330224b3e194b93b30d1996f1af1fe5589ce42301b21ec6ed7778546a0d4"} Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.147389 5036 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8k59b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.147430 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.158138 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" gracePeriod=30 Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.208115 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.211847 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.711827299 +0000 UTC m=+43.582062793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.230120 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" podStartSLOduration=23.230104024 podStartE2EDuration="23.230104024s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:41.227105163 +0000 UTC m=+43.097340657" watchObservedRunningTime="2026-01-10 16:28:41.230104024 +0000 UTC m=+43.100339518" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.297214 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w6mpm" podStartSLOduration=10.297191389 podStartE2EDuration="10.297191389s" podCreationTimestamp="2026-01-10 16:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:41.284817824 +0000 UTC m=+43.155053308" watchObservedRunningTime="2026-01-10 16:28:41.297191389 +0000 UTC m=+43.167426883" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.298001 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.312122 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.323666 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.324234 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.338970 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339300 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339351 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339371 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339420 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339479 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhh59\" (UniqueName: \"kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.339523 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7f7\" (UniqueName: \"kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.339646 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.839628287 +0000 UTC m=+43.709863781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.373535 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lzkzv" podStartSLOduration=23.373511964 podStartE2EDuration="23.373511964s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:41.373353689 +0000 UTC m=+43.243589173" watchObservedRunningTime="2026-01-10 16:28:41.373511964 +0000 UTC m=+43.243747458" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443662 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443744 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7f7\" (UniqueName: \"kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443791 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443823 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443839 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443869 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities\") pod \"certified-operators-hm8ns\" (UID: 
\"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.443895 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhh59\" (UniqueName: \"kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.444181 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:41.944157005 +0000 UTC m=+43.814392499 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.445036 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.445081 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.445145 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.460289 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.509857 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.511038 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.521071 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7f7\" (UniqueName: \"kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7\") pod \"certified-operators-hm8ns\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.526167 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhh59\" (UniqueName: \"kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59\") pod \"community-operators-9w55c\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.545345 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.545569 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvk7n\" (UniqueName: \"kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.545617 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.545707 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.545813 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.045797405 +0000 UTC m=+43.916032899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.579613 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.652636 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.652723 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.652750 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.652783 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvk7n\" (UniqueName: \"kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.653395 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.653623 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.653723 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.153706394 +0000 UTC m=+44.023941888 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.653779 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.691141 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvk7n\" (UniqueName: \"kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n\") pod \"community-operators-czknz\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.730122 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.731136 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.731131 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.755243 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.755815 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.255792456 +0000 UTC m=+44.126027950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.782327 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-5hjq7" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.803539 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.857974 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.858046 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq8hl\" (UniqueName: \"kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.858087 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.858125 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.858610 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.358590308 +0000 UTC m=+44.228825802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.868043 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.960041 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.960333 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq8hl\" (UniqueName: \"kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.960364 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.960390 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.960972 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: E0110 16:28:41.961089 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.461050699 +0000 UTC m=+44.331286193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.961792 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.964517 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:41 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:41 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:41 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:41 crc kubenswrapper[5036]: I0110 16:28:41.964585 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.010709 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq8hl\" (UniqueName: \"kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl\") pod \"certified-operators-lsfx8\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.057309 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.068741 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.069124 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.569112043 +0000 UTC m=+44.439347537 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.173983 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.174274 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.674257247 +0000 UTC m=+44.544492741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.223283 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" event={"ID":"79a5550c-f653-48bd-9199-53843401d87e","Type":"ContainerStarted","Data":"2af48053c486f2ecd303fe23028419a8dd9961db06fee49f91bf453da40de8c4"} Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.255868 5036 patch_prober.go:28] interesting pod/downloads-7954f5f757-km6m5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.256660 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-km6m5" podUID="22345551-25b1-48ef-8bfb-c4b4c10170fd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.277393 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.287831 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.787811639 +0000 UTC m=+44.658047133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.299248 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.331916 5036 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.384352 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.387766 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.887736502 +0000 UTC m=+44.757971996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.471955 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.487981 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.488742 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-10 16:28:42.988722054 +0000 UTC m=+44.858957548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mjcps" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.551369 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.551414 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.552083 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.552176 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.562091 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.562325 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.590993 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.591260 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.591328 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: E0110 16:28:42.591520 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-10 16:28:43.091499945 +0000 UTC m=+44.961735439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.610793 5036 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-10T16:28:42.331950603Z","Handler":null,"Name":""} Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.642352 5036 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.642397 5036 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.660904 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.693259 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.693320 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.693371 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.693401 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.706455 5036 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
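
The entries above trace the recovery cycle for the image-registry PVC: MountVolume.MountDevice and UnmountVolume.TearDown for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 keep failing with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" and are rescheduled every 500ms, until the plugin socket /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock is picked up at 16:28:42.331, the driver is registered against /var/lib/kubelet/plugins/csi-hostpath/csi.sock, and the attacher skips MountDevice because STAGE_UNSTAGE_VOLUME is not set; the MountDevice/SetUp successes follow immediately below. A minimal sketch for pulling that cycle out of a saved copy of this log is shown here; the file name kubelet.log and the helper itself are illustrative assumptions, not anything shipped with the kubelet.

    # Hypothetical helper (not part of this capture): trace the mount/unmount retry
    # cycle for one CSI volume in a saved kubelet log. The volume ID and driver name
    # are taken from the entries above; the log path is an assumption.
    import re

    VOLUME = "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
    PATTERNS = [
        ("retry scheduled",    re.compile(r"No retries permitted until")),
        ("driver not found",   re.compile(r"not found in the list of registered CSI drivers")),
        ("plugin socket seen", re.compile(r"plugins_registry/kubevirt\.io\.hostpath-provisioner-reg\.sock")),
        ("driver registered",  re.compile(r"Register new plugin with name: kubevirt\.io\.hostpath-provisioner")),
        ("mount succeeded",    re.compile(r"MountVolume\.(MountDevice|SetUp) succeeded")),
    ]

    with open("kubelet.log", encoding="utf-8", errors="replace") as f:
        for line in f:
            if VOLUME not in line and "hostpath-provisioner" not in line:
                continue
            for label, pattern in PATTERNS:
                if pattern.search(line):
                    # keep the klog timestamp (e.g. "I0110 16:28:42.706511") for ordering
                    ts = re.search(r"[IWE]\d{4} \d{2}:\d{2}:\d{2}\.\d+", line)
                    print(f"{(ts.group(0) if ts else '?'):<22} {label}")
                    break

Run against a decompressed copy of this log, that should print the repeated 500ms retries for the volume, the plugin_watcher/csi_plugin registration entries at 16:28:42, and the MountDevice/SetUp successes that follow below.
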
Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.706511 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: W0110 16:28:42.707735 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1513baef_e92c_4399_ae0f_b8fe4a738702.slice/crio-690c5256f10be1e066e92ef9c938a2d768a8ae9899e67eadf97429cc99ff4a92 WatchSource:0}: Error finding container 690c5256f10be1e066e92ef9c938a2d768a8ae9899e67eadf97429cc99ff4a92: Status 404 returned error can't find the container with id 690c5256f10be1e066e92ef9c938a2d768a8ae9899e67eadf97429cc99ff4a92 Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.736073 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.762832 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.782638 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.782671 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.791603 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.793537 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mjcps\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.895901 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.900761 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.944106 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.956092 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.960138 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:42 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:42 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:42 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:42 crc kubenswrapper[5036]: I0110 16:28:42.960211 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.039007 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.208171 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.233187 5036 generic.go:334] "Generic (PLEG): container finished" podID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerID="ee4c415993b049cf1be4535dff56f2ffb5448baa6ec368dc6f8cac7022f6a89a" exitCode=0 Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.233766 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerDied","Data":"ee4c415993b049cf1be4535dff56f2ffb5448baa6ec368dc6f8cac7022f6a89a"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.233824 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerStarted","Data":"dcdc83151b5c40ecda1d0738d7370b3ab6507acf76d50edd224d77db33ac104f"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.236321 5036 generic.go:334] "Generic (PLEG): container finished" podID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerID="da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909" exitCode=0 Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.236723 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerDied","Data":"da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.236751 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerStarted","Data":"690c5256f10be1e066e92ef9c938a2d768a8ae9899e67eadf97429cc99ff4a92"} Jan 10 16:28:43 crc 
kubenswrapper[5036]: I0110 16:28:43.250217 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.250806 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" event={"ID":"79a5550c-f653-48bd-9199-53843401d87e","Type":"ContainerStarted","Data":"d551a35d52cbf10e8e7e79fe539e68e9a941d59eb724c0458e66db4aa1bd4937"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.250864 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" event={"ID":"79a5550c-f653-48bd-9199-53843401d87e","Type":"ContainerStarted","Data":"2580658380e585b72c26c414d5768bdefead1ef10bfc7a1f06d396084e84ec2e"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.258078 5036 generic.go:334] "Generic (PLEG): container finished" podID="0239b380-03c8-455e-a981-2aaaae000828" containerID="472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753" exitCode=0 Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.258148 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerDied","Data":"472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.258175 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerStarted","Data":"60b567199e6d9c4d632b9da33745bc57c86d7c89b57d95171070f1869d46c2a8"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.263750 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ccdfa955-8cc4-40c5-a926-cc3969e0721d","Type":"ContainerStarted","Data":"5a9916c2f4f25b2b9bfde786f2c16578c13f626c2213f7b905ae282cecbf2db8"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.265891 5036 generic.go:334] "Generic (PLEG): container finished" podID="ea0d5867-9889-49bd-b23e-545606295a7a" containerID="0d3fa54785d71a6ebcdc054c6f935cb5361813444368855e0226cec9ed56733c" exitCode=0 Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.269160 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerDied","Data":"0d3fa54785d71a6ebcdc054c6f935cb5361813444368855e0226cec9ed56733c"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.269210 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerStarted","Data":"7eb1372bb4f38659f5b9cce51f162d06cf5d3b2dd27d69562020d5bbc6f8a7b9"} Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.281822 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dpl6f" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.309434 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.317450 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.319571 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.343065 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lc5jj" podStartSLOduration=12.343033988 podStartE2EDuration="12.343033988s" podCreationTimestamp="2026-01-10 16:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:43.29950169 +0000 UTC m=+45.169737184" watchObservedRunningTime="2026-01-10 16:28:43.343033988 +0000 UTC m=+45.213269482" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.344501 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.358520 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.407593 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.407643 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.407671 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx69v\" (UniqueName: \"kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.508830 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.508878 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.508906 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx69v\" (UniqueName: \"kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v\") pod \"redhat-marketplace-vzvbk\" (UID: 
\"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.509540 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.509771 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.531660 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx69v\" (UniqueName: \"kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v\") pod \"redhat-marketplace-vzvbk\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.570451 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.570510 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.578763 5036 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7lh8w container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]log ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]etcd ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/generic-apiserver-start-informers ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/max-in-flight-filter ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 10 16:28:43 crc kubenswrapper[5036]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 10 16:28:43 crc kubenswrapper[5036]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/project.openshift.io-projectcache ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/openshift.io-startinformers ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 10 16:28:43 crc kubenswrapper[5036]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 10 16:28:43 crc kubenswrapper[5036]: livez check failed Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.578858 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" podUID="ec9ab704-1f8b-473b-bbe2-3c09d04991cd" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.646939 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.685924 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.687357 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.711414 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.711625 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqzm\" (UniqueName: \"kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.711927 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.711977 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.813283 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grqzm\" (UniqueName: \"kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.813757 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.813793 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.814341 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities\") pod \"redhat-marketplace-v2wzb\" (UID: 
\"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.814404 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.840166 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqzm\" (UniqueName: \"kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm\") pod \"redhat-marketplace-v2wzb\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.942023 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:28:43 crc kubenswrapper[5036]: W0110 16:28:43.949976 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1efe898b_dc49_41f9_a296_84f826548896.slice/crio-a27d7332b755469449bfece3cb9968b9e69d3f3e62ebd6cbce90ccfb06a0c787 WatchSource:0}: Error finding container a27d7332b755469449bfece3cb9968b9e69d3f3e62ebd6cbce90ccfb06a0c787: Status 404 returned error can't find the container with id a27d7332b755469449bfece3cb9968b9e69d3f3e62ebd6cbce90ccfb06a0c787 Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.970991 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:43 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:43 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:43 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:43 crc kubenswrapper[5036]: I0110 16:28:43.971056 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.017069 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.293368 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.294923 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.297169 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.304209 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.311524 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" event={"ID":"d8d9ae9f-271e-402d-8ec6-a2e25057090e","Type":"ContainerStarted","Data":"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669"} Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.311573 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" event={"ID":"d8d9ae9f-271e-402d-8ec6-a2e25057090e","Type":"ContainerStarted","Data":"5c46d35bab5af69a8e35b49ae719acb555344724cbb82e6100619425a6639311"} Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.311652 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.320389 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.320450 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.320521 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh2wk\" (UniqueName: \"kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.326851 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ccdfa955-8cc4-40c5-a926-cc3969e0721d","Type":"ContainerStarted","Data":"d599b7da09b5a8bcd0b6d1edff7e8cc2cc16e25abdf8363c5b20c295b2b67d40"} Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.330447 5036 generic.go:334] "Generic (PLEG): container finished" podID="1efe898b-dc49-41f9-a296-84f826548896" containerID="ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574" exitCode=0 Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.330788 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerDied","Data":"ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574"} Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.330844 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerStarted","Data":"a27d7332b755469449bfece3cb9968b9e69d3f3e62ebd6cbce90ccfb06a0c787"} Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.393393 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" podStartSLOduration=26.393373314 podStartE2EDuration="26.393373314s" podCreationTimestamp="2026-01-10 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:44.346868186 +0000 UTC m=+46.217103700" watchObservedRunningTime="2026-01-10 16:28:44.393373314 +0000 UTC m=+46.263608808" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.396891 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.415782 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.41576442 podStartE2EDuration="2.41576442s" podCreationTimestamp="2026-01-10 16:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:44.414004303 +0000 UTC m=+46.284239797" watchObservedRunningTime="2026-01-10 16:28:44.41576442 +0000 UTC m=+46.285999924" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.421322 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh2wk\" (UniqueName: \"kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.421465 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.421547 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.422123 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.433937 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.466854 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh2wk\" (UniqueName: \"kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk\") pod \"redhat-operators-q72r7\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.537977 5036 patch_prober.go:28] interesting pod/console-f9d7485db-bvg6n container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.538039 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bvg6n" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.575583 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.576332 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.576362 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.625459 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.672302 5036 patch_prober.go:28] interesting pod/downloads-7954f5f757-km6m5 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.672967 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-km6m5" podUID="22345551-25b1-48ef-8bfb-c4b4c10170fd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.673274 5036 patch_prober.go:28] interesting pod/downloads-7954f5f757-km6m5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.673306 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-km6m5" podUID="22345551-25b1-48ef-8bfb-c4b4c10170fd" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.685994 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.687294 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.697888 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:28:44 crc kubenswrapper[5036]: E0110 16:28:44.720879 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:44 crc kubenswrapper[5036]: E0110 16:28:44.732334 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:44 crc kubenswrapper[5036]: E0110 16:28:44.735646 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:44 crc kubenswrapper[5036]: E0110 16:28:44.735757 5036 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.826429 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrjbz\" (UniqueName: \"kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.826545 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.826609 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.928486 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.929121 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mrjbz\" (UniqueName: \"kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.929048 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.930604 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.931918 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.954463 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrjbz\" (UniqueName: \"kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz\") pod \"redhat-operators-8qz7l\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.963285 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:44 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:44 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:44 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:44 crc kubenswrapper[5036]: I0110 16:28:44.963380 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.016639 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.089611 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:28:45 crc kubenswrapper[5036]: W0110 16:28:45.100341 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3cdeec_7336_463c_bbbb_488ece81fa0b.slice/crio-40913a4a3d0e246e3ec19749b2e2cdb4156de220cccc3b6fd6bccdc2b64803d5 WatchSource:0}: Error finding container 40913a4a3d0e246e3ec19749b2e2cdb4156de220cccc3b6fd6bccdc2b64803d5: Status 404 returned error can't find the container with id 40913a4a3d0e246e3ec19749b2e2cdb4156de220cccc3b6fd6bccdc2b64803d5 Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.363374 5036 generic.go:334] "Generic (PLEG): container finished" podID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerID="8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06" exitCode=0 Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.367016 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerDied","Data":"8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06"} Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.367060 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerStarted","Data":"c402861909e7898d6ccb6f0f55744ee44b4d1c60b6157a126658c3077bf9bdb5"} Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.381801 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerStarted","Data":"40913a4a3d0e246e3ec19749b2e2cdb4156de220cccc3b6fd6bccdc2b64803d5"} Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.412251 5036 generic.go:334] "Generic (PLEG): container finished" podID="ccdfa955-8cc4-40c5-a926-cc3969e0721d" containerID="d599b7da09b5a8bcd0b6d1edff7e8cc2cc16e25abdf8363c5b20c295b2b67d40" exitCode=0 Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.412613 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ccdfa955-8cc4-40c5-a926-cc3969e0721d","Type":"ContainerDied","Data":"d599b7da09b5a8bcd0b6d1edff7e8cc2cc16e25abdf8363c5b20c295b2b67d40"} Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.450950 5036 generic.go:334] "Generic (PLEG): container finished" podID="8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" containerID="27662017b13517331e40f4940e69f8313a92f2b48d53a21db24568d97f34793a" exitCode=0 Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.451467 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" event={"ID":"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4","Type":"ContainerDied","Data":"27662017b13517331e40f4940e69f8313a92f2b48d53a21db24568d97f34793a"} Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.590429 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:28:45 crc kubenswrapper[5036]: W0110 16:28:45.615993 5036 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod963d9e81_5aca_4e34_b326_ffb47bcf98ba.slice/crio-51d735fac832c36544bc72aa72579549ce1053afce368e295ff8c6fc0996121e WatchSource:0}: Error finding container 51d735fac832c36544bc72aa72579549ce1053afce368e295ff8c6fc0996121e: Status 404 returned error can't find the container with id 51d735fac832c36544bc72aa72579549ce1053afce368e295ff8c6fc0996121e Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.963987 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:45 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:45 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:45 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:45 crc kubenswrapper[5036]: I0110 16:28:45.964080 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.460740 5036 generic.go:334] "Generic (PLEG): container finished" podID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerID="6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3" exitCode=0 Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.460822 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerDied","Data":"6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3"} Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.460896 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerStarted","Data":"51d735fac832c36544bc72aa72579549ce1053afce368e295ff8c6fc0996121e"} Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.465311 5036 generic.go:334] "Generic (PLEG): container finished" podID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerID="31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032" exitCode=0 Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.465411 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerDied","Data":"31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032"} Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.845251 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.890490 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.974409 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:46 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:46 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:46 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:46 crc kubenswrapper[5036]: I0110 16:28:46.974525 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.006538 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.014901 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir\") pod \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.015026 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ccdfa955-8cc4-40c5-a926-cc3969e0721d" (UID: "ccdfa955-8cc4-40c5-a926-cc3969e0721d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.015071 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume\") pod \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.015179 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access\") pod \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\" (UID: \"ccdfa955-8cc4-40c5-a926-cc3969e0721d\") " Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.016499 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml929\" (UniqueName: \"kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929\") pod \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.016546 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume\") pod \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\" (UID: \"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4\") " Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.018067 5036 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.019867 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume" (OuterVolumeSpecName: "config-volume") pod "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" (UID: "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.028924 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ccdfa955-8cc4-40c5-a926-cc3969e0721d" (UID: "ccdfa955-8cc4-40c5-a926-cc3969e0721d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.031493 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.032325 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929" (OuterVolumeSpecName: "kube-api-access-ml929") pod "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" (UID: "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4"). InnerVolumeSpecName "kube-api-access-ml929". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.032619 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" (UID: "8ee18389-eb4f-4c7b-98bf-2f9785f21ce4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.118586 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml929\" (UniqueName: \"kubernetes.io/projected/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-kube-api-access-ml929\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.118632 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.118645 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.118657 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccdfa955-8cc4-40c5-a926-cc3969e0721d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.509637 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ccdfa955-8cc4-40c5-a926-cc3969e0721d","Type":"ContainerDied","Data":"5a9916c2f4f25b2b9bfde786f2c16578c13f626c2213f7b905ae282cecbf2db8"} Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.509762 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a9916c2f4f25b2b9bfde786f2c16578c13f626c2213f7b905ae282cecbf2db8" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.509981 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.516734 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.516715 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7" event={"ID":"8ee18389-eb4f-4c7b-98bf-2f9785f21ce4","Type":"ContainerDied","Data":"e30d4a6522341d415e7b16900762a8b1b702e55abd5aafe38dc1153573fe767a"} Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.516815 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e30d4a6522341d415e7b16900762a8b1b702e55abd5aafe38dc1153573fe767a" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.576904 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=0.576868952 podStartE2EDuration="576.868952ms" podCreationTimestamp="2026-01-10 16:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:47.565597997 +0000 UTC m=+49.435833511" watchObservedRunningTime="2026-01-10 16:28:47.576868952 +0000 UTC m=+49.447104466" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.581439 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 10 16:28:47 crc kubenswrapper[5036]: E0110 16:28:47.589436 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccdfa955-8cc4-40c5-a926-cc3969e0721d" containerName="pruner" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.589476 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccdfa955-8cc4-40c5-a926-cc3969e0721d" containerName="pruner" Jan 10 16:28:47 crc kubenswrapper[5036]: E0110 16:28:47.589518 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" containerName="collect-profiles" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.589527 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" containerName="collect-profiles" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.589740 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccdfa955-8cc4-40c5-a926-cc3969e0721d" containerName="pruner" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.589766 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" containerName="collect-profiles" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.590359 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.596462 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.600337 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.600423 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.627512 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.627593 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.728737 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.728893 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.728830 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.751764 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.916733 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.960193 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:47 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:47 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:47 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:47 crc kubenswrapper[5036]: I0110 16:28:47.960264 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.314754 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 10 16:28:48 crc kubenswrapper[5036]: W0110 16:28:48.355221 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podddbb1e96_d05d_43be_b0f5_7d5e01d707a3.slice/crio-cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6 WatchSource:0}: Error finding container cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6: Status 404 returned error can't find the container with id cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6 Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.576623 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.582797 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7lh8w" Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.600127 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3","Type":"ContainerStarted","Data":"cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6"} Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.964403 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:48 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:48 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:48 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:48 crc kubenswrapper[5036]: I0110 16:28:48.965910 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:49 crc kubenswrapper[5036]: I0110 16:28:49.794376 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w6mpm" Jan 10 16:28:49 crc kubenswrapper[5036]: I0110 16:28:49.810275 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3","Type":"ContainerStarted","Data":"ce2d1e4e1a7a4d8828114cc415c7a7b575d1c586126548b6d4c2472f420bfcc7"} Jan 10 16:28:49 crc kubenswrapper[5036]: I0110 16:28:49.902444 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.902416478 podStartE2EDuration="2.902416478s" podCreationTimestamp="2026-01-10 16:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:28:49.891926164 +0000 UTC m=+51.762161658" watchObservedRunningTime="2026-01-10 16:28:49.902416478 +0000 UTC m=+51.772651972" Jan 10 16:28:49 crc kubenswrapper[5036]: I0110 16:28:49.959060 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:49 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:49 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:49 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:49 crc kubenswrapper[5036]: I0110 16:28:49.959149 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:50 crc kubenswrapper[5036]: I0110 16:28:50.959404 5036 patch_prober.go:28] interesting pod/router-default-5444994796-kcb5k container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 10 16:28:50 crc kubenswrapper[5036]: [-]has-synced failed: reason withheld Jan 10 16:28:50 crc kubenswrapper[5036]: [+]process-running ok Jan 10 16:28:50 crc kubenswrapper[5036]: healthz check failed Jan 10 16:28:50 crc kubenswrapper[5036]: I0110 16:28:50.959475 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kcb5k" podUID="5620c8e3-4592-4189-b074-4ea40e9447ff" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 10 16:28:51 crc kubenswrapper[5036]: I0110 16:28:51.859135 5036 generic.go:334] "Generic (PLEG): container finished" podID="ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" containerID="ce2d1e4e1a7a4d8828114cc415c7a7b575d1c586126548b6d4c2472f420bfcc7" exitCode=0 Jan 10 16:28:51 crc kubenswrapper[5036]: I0110 16:28:51.859577 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3","Type":"ContainerDied","Data":"ce2d1e4e1a7a4d8828114cc415c7a7b575d1c586126548b6d4c2472f420bfcc7"} Jan 10 16:28:51 crc kubenswrapper[5036]: I0110 16:28:51.960532 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:51 crc kubenswrapper[5036]: I0110 16:28:51.968281 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kcb5k" Jan 10 16:28:54 crc kubenswrapper[5036]: I0110 16:28:54.536707 5036 patch_prober.go:28] interesting pod/console-f9d7485db-bvg6n container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 10 16:28:54 crc kubenswrapper[5036]: I0110 16:28:54.537216 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bvg6n" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 10 16:28:54 crc kubenswrapper[5036]: I0110 16:28:54.677590 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-km6m5" Jan 10 16:28:54 crc kubenswrapper[5036]: E0110 16:28:54.699089 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:54 crc kubenswrapper[5036]: E0110 16:28:54.751395 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:54 crc kubenswrapper[5036]: E0110 16:28:54.767196 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:28:54 crc kubenswrapper[5036]: E0110 16:28:54.767301 5036 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:28:55 crc kubenswrapper[5036]: I0110 16:28:55.647355 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:28:55 crc kubenswrapper[5036]: I0110 16:28:55.647711 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerName="controller-manager" containerID="cri-o://e84d04a2fdff42c4d5d6844eaf4e32d6562610178581a08f7812e91c8f66191a" gracePeriod=30 Jan 10 16:28:55 crc kubenswrapper[5036]: I0110 16:28:55.732542 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:28:55 crc kubenswrapper[5036]: I0110 16:28:55.733086 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerName="route-controller-manager" containerID="cri-o://f2e43ef035148b716424bf58d83371843de23f90899e03d9effeaa90ce4e3c37" gracePeriod=30 Jan 10 16:28:56 crc kubenswrapper[5036]: I0110 16:28:56.921487 5036 generic.go:334] "Generic (PLEG): container finished" 
podID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerID="f2e43ef035148b716424bf58d83371843de23f90899e03d9effeaa90ce4e3c37" exitCode=0 Jan 10 16:28:56 crc kubenswrapper[5036]: I0110 16:28:56.921582 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" event={"ID":"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581","Type":"ContainerDied","Data":"f2e43ef035148b716424bf58d83371843de23f90899e03d9effeaa90ce4e3c37"} Jan 10 16:28:56 crc kubenswrapper[5036]: I0110 16:28:56.925042 5036 generic.go:334] "Generic (PLEG): container finished" podID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerID="e84d04a2fdff42c4d5d6844eaf4e32d6562610178581a08f7812e91c8f66191a" exitCode=0 Jan 10 16:28:56 crc kubenswrapper[5036]: I0110 16:28:56.925095 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" event={"ID":"792beb3d-c532-4c80-8ab7-3024b5db8512","Type":"ContainerDied","Data":"e84d04a2fdff42c4d5d6844eaf4e32d6562610178581a08f7812e91c8f66191a"} Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.819961 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.821961 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.827786 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.872481 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:00 crc kubenswrapper[5036]: E0110 16:29:00.873005 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" containerName="pruner" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873032 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" containerName="pruner" Jan 10 16:29:00 crc kubenswrapper[5036]: E0110 16:29:00.873073 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerName="route-controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873085 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerName="route-controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: E0110 16:29:00.873102 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerName="controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873113 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerName="controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873256 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" containerName="controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873270 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" containerName="pruner" Jan 10 16:29:00 crc 
kubenswrapper[5036]: I0110 16:29:00.873286 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" containerName="route-controller-manager" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.873811 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.888898 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.923940 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.924082 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2tl\" (UniqueName: \"kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.924133 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.924191 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.955869 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" event={"ID":"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581","Type":"ContainerDied","Data":"dfd8ebc1df64a9bb06bb11143df10f701fe3e2621e0c012af0da224fe04c2a1f"} Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.955976 5036 scope.go:117] "RemoveContainer" containerID="f2e43ef035148b716424bf58d83371843de23f90899e03d9effeaa90ce4e3c37" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.956085 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.962517 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.962517 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9r9hf" event={"ID":"792beb3d-c532-4c80-8ab7-3024b5db8512","Type":"ContainerDied","Data":"c53490abff74cec79201ea092b68350d6df5056bbfbb61e7ff7e66b78dcde75d"} Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.978594 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3","Type":"ContainerDied","Data":"cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6"} Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.978644 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd8fbe9f32377925e8a82c326667d3e8753e68270a02862a151f3eb9d94dacd6" Jan 10 16:29:00 crc kubenswrapper[5036]: I0110 16:29:00.978709 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.024879 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmdnp\" (UniqueName: \"kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp\") pod \"792beb3d-c532-4c80-8ab7-3024b5db8512\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.024949 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert\") pod \"792beb3d-c532-4c80-8ab7-3024b5db8512\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.024993 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir\") pod \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025021 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvnkg\" (UniqueName: \"kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg\") pod \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025051 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access\") pod \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\" (UID: \"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025083 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles\") pod \"792beb3d-c532-4c80-8ab7-3024b5db8512\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025146 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config\") pod \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025168 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca\") pod \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025198 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert\") pod \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\" (UID: \"e5ea287e-5a20-4798-8f4b-4f2d0e5b1581\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025228 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca\") pod \"792beb3d-c532-4c80-8ab7-3024b5db8512\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025266 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config\") pod \"792beb3d-c532-4c80-8ab7-3024b5db8512\" (UID: \"792beb3d-c532-4c80-8ab7-3024b5db8512\") " Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025350 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025415 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025446 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2tl\" (UniqueName: \"kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025472 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.025774 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" (UID: "ddbb1e96-d05d-43be-b0f5-7d5e01d707a3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.026860 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "792beb3d-c532-4c80-8ab7-3024b5db8512" (UID: "792beb3d-c532-4c80-8ab7-3024b5db8512"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.026863 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca" (OuterVolumeSpecName: "client-ca") pod "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" (UID: "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.026914 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.026988 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config" (OuterVolumeSpecName: "config") pod "792beb3d-c532-4c80-8ab7-3024b5db8512" (UID: "792beb3d-c532-4c80-8ab7-3024b5db8512"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.027024 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca" (OuterVolumeSpecName: "client-ca") pod "792beb3d-c532-4c80-8ab7-3024b5db8512" (UID: "792beb3d-c532-4c80-8ab7-3024b5db8512"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.027017 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config" (OuterVolumeSpecName: "config") pod "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" (UID: "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.027908 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.032812 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.033025 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ddbb1e96-d05d-43be-b0f5-7d5e01d707a3" (UID: "ddbb1e96-d05d-43be-b0f5-7d5e01d707a3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.033027 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" (UID: "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.034104 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg" (OuterVolumeSpecName: "kube-api-access-mvnkg") pod "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" (UID: "e5ea287e-5a20-4798-8f4b-4f2d0e5b1581"). InnerVolumeSpecName "kube-api-access-mvnkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.034099 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "792beb3d-c532-4c80-8ab7-3024b5db8512" (UID: "792beb3d-c532-4c80-8ab7-3024b5db8512"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.042253 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp" (OuterVolumeSpecName: "kube-api-access-pmdnp") pod "792beb3d-c532-4c80-8ab7-3024b5db8512" (UID: "792beb3d-c532-4c80-8ab7-3024b5db8512"). InnerVolumeSpecName "kube-api-access-pmdnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.044578 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2tl\" (UniqueName: \"kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl\") pod \"route-controller-manager-67b98d4d88-2z2l2\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126014 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126048 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126058 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmdnp\" (UniqueName: \"kubernetes.io/projected/792beb3d-c532-4c80-8ab7-3024b5db8512-kube-api-access-pmdnp\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126069 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792beb3d-c532-4c80-8ab7-3024b5db8512-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126077 5036 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126087 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvnkg\" (UniqueName: \"kubernetes.io/projected/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-kube-api-access-mvnkg\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126096 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddbb1e96-d05d-43be-b0f5-7d5e01d707a3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126107 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/792beb3d-c532-4c80-8ab7-3024b5db8512-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126116 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126126 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.126134 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.214304 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.297971 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.302725 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b4wwd"] Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.317901 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:29:01 crc kubenswrapper[5036]: I0110 16:29:01.322731 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9r9hf"] Jan 10 16:29:02 crc kubenswrapper[5036]: I0110 16:29:02.527174 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792beb3d-c532-4c80-8ab7-3024b5db8512" path="/var/lib/kubelet/pods/792beb3d-c532-4c80-8ab7-3024b5db8512/volumes" Jan 10 16:29:02 crc kubenswrapper[5036]: I0110 16:29:02.528269 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ea287e-5a20-4798-8f4b-4f2d0e5b1581" path="/var/lib/kubelet/pods/e5ea287e-5a20-4798-8f4b-4f2d0e5b1581/volumes" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.045599 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.772002 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.774212 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.777349 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.777540 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.778123 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.778406 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.778791 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.779061 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.780013 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.785620 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.966621 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.966713 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.966827 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw6xc\" (UniqueName: \"kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.967122 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:03 crc kubenswrapper[5036]: I0110 16:29:03.967218 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.068776 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw6xc\" (UniqueName: \"kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.068881 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.068904 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.068935 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.068955 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.071015 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.071215 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.071298 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 
crc kubenswrapper[5036]: I0110 16:29:04.073938 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.085908 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw6xc\" (UniqueName: \"kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc\") pod \"controller-manager-887496984-fshvp\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.100569 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.541804 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:29:04 crc kubenswrapper[5036]: I0110 16:29:04.546276 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:29:04 crc kubenswrapper[5036]: E0110 16:29:04.695128 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:04 crc kubenswrapper[5036]: E0110 16:29:04.696728 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:04 crc kubenswrapper[5036]: E0110 16:29:04.697906 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:04 crc kubenswrapper[5036]: E0110 16:29:04.697939 5036 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:11 crc kubenswrapper[5036]: I0110 16:29:11.529627 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 10 16:29:12 crc kubenswrapper[5036]: I0110 16:29:12.059349 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lt5rc_56edcbe7-428c-4373-928d-b2fdf97a0a3a/kube-multus-additional-cni-plugins/0.log" Jan 10 16:29:12 crc kubenswrapper[5036]: I0110 16:29:12.059444 5036 generic.go:334] "Generic (PLEG): container finished" 
podID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" exitCode=137 Jan 10 16:29:12 crc kubenswrapper[5036]: I0110 16:29:12.059578 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" event={"ID":"56edcbe7-428c-4373-928d-b2fdf97a0a3a","Type":"ContainerDied","Data":"a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62"} Jan 10 16:29:13 crc kubenswrapper[5036]: I0110 16:29:13.823889 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 10 16:29:13 crc kubenswrapper[5036]: I0110 16:29:13.860741 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.860708089 podStartE2EDuration="2.860708089s" podCreationTimestamp="2026-01-10 16:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:13.84018179 +0000 UTC m=+75.710417284" watchObservedRunningTime="2026-01-10 16:29:13.860708089 +0000 UTC m=+75.730943603" Jan 10 16:29:14 crc kubenswrapper[5036]: I0110 16:29:14.335310 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lr2qm" Jan 10 16:29:14 crc kubenswrapper[5036]: E0110 16:29:14.693539 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:14 crc kubenswrapper[5036]: E0110 16:29:14.694792 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:14 crc kubenswrapper[5036]: E0110 16:29:14.695491 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:14 crc kubenswrapper[5036]: E0110 16:29:14.695553 5036 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:15 crc kubenswrapper[5036]: I0110 16:29:15.572359 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:15 crc kubenswrapper[5036]: I0110 
16:29:15.593723 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:16 crc kubenswrapper[5036]: I0110 16:29:16.540170 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 10 16:29:17 crc kubenswrapper[5036]: E0110 16:29:17.085979 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 10 16:29:17 crc kubenswrapper[5036]: E0110 16:29:17.086217 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvk7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-czknz_openshift-marketplace(3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:17 crc kubenswrapper[5036]: E0110 16:29:17.087513 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-czknz" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" Jan 10 16:29:17 crc kubenswrapper[5036]: I0110 16:29:17.199539 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.199505179 podStartE2EDuration="1.199505179s" podCreationTimestamp="2026-01-10 16:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:17.196947486 +0000 UTC m=+79.067182980" watchObservedRunningTime="2026-01-10 16:29:17.199505179 +0000 UTC m=+79.069740673" Jan 10 16:29:20 crc kubenswrapper[5036]: E0110 16:29:20.084644 5036 
log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 10 16:29:20 crc kubenswrapper[5036]: E0110 16:29:20.085037 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jhh59,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9w55c_openshift-marketplace(0239b380-03c8-455e-a981-2aaaae000828): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:20 crc kubenswrapper[5036]: E0110 16:29:20.086378 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9w55c" podUID="0239b380-03c8-455e-a981-2aaaae000828" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.161491 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.170980 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.171164 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.174183 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.174379 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 10 16:29:23 crc kubenswrapper[5036]: E0110 16:29:23.240359 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-czknz" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" Jan 10 16:29:23 crc kubenswrapper[5036]: E0110 16:29:23.240834 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9w55c" podUID="0239b380-03c8-455e-a981-2aaaae000828" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.308553 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.308616 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.409537 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.409622 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.409825 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.444002 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: I0110 16:29:23.498980 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:23 crc kubenswrapper[5036]: E0110 16:29:23.738845 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 10 16:29:23 crc kubenswrapper[5036]: E0110 16:29:23.739403 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mh2wk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q72r7_openshift-marketplace(fe3cdeec-7336-463c-bbbb-488ece81fa0b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:23 crc kubenswrapper[5036]: E0110 16:29:23.743872 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q72r7" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" Jan 10 16:29:24 crc kubenswrapper[5036]: E0110 16:29:24.693119 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:24 crc kubenswrapper[5036]: E0110 16:29:24.693737 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:24 crc kubenswrapper[5036]: E0110 16:29:24.694081 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 10 16:29:24 crc kubenswrapper[5036]: E0110 16:29:24.694143 5036 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:24 crc kubenswrapper[5036]: E0110 16:29:24.987524 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q72r7" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" Jan 10 16:29:25 crc kubenswrapper[5036]: E0110 16:29:25.039970 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 10 16:29:25 crc kubenswrapper[5036]: E0110 16:29:25.040314 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grqzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-v2wzb_openshift-marketplace(9f0cd226-9f92-4ef2-a82b-7746983ab42e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:25 crc kubenswrapper[5036]: E0110 16:29:25.041515 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v2wzb" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.401669 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v2wzb" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.413012 5036 scope.go:117] "RemoveContainer" containerID="e84d04a2fdff42c4d5d6844eaf4e32d6562610178581a08f7812e91c8f66191a" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.490038 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.490326 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wq8hl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lsfx8_openshift-marketplace(ea0d5867-9889-49bd-b23e-545606295a7a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.491977 5036 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lsfx8" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.509677 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lt5rc_56edcbe7-428c-4373-928d-b2fdf97a0a3a/kube-multus-additional-cni-plugins/0.log" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.510192 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.549139 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.549351 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wx69v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vzvbk_openshift-marketplace(1efe898b-dc49-41f9-a296-84f826548896): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.551599 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vzvbk" podUID="1efe898b-dc49-41f9-a296-84f826548896" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.560935 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.561109 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7q7f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hm8ns_openshift-marketplace(1513baef-e92c-4399-ae0f-b8fe4a738702): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.562261 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hm8ns" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.568860 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.569057 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mrjbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8qz7l_openshift-marketplace(963d9e81-5aca-4e34-b326-ffb47bcf98ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 10 16:29:26 crc kubenswrapper[5036]: E0110 16:29:26.571899 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8qz7l" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.654484 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir\") pod \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.654542 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4mkt\" (UniqueName: \"kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt\") pod \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.654641 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready\") pod \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.654676 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist\") pod \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\" (UID: \"56edcbe7-428c-4373-928d-b2fdf97a0a3a\") " Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.655209 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "56edcbe7-428c-4373-928d-b2fdf97a0a3a" (UID: "56edcbe7-428c-4373-928d-b2fdf97a0a3a"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.656816 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready" (OuterVolumeSpecName: "ready") pod "56edcbe7-428c-4373-928d-b2fdf97a0a3a" (UID: "56edcbe7-428c-4373-928d-b2fdf97a0a3a"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.656997 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "56edcbe7-428c-4373-928d-b2fdf97a0a3a" (UID: "56edcbe7-428c-4373-928d-b2fdf97a0a3a"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.669450 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt" (OuterVolumeSpecName: "kube-api-access-h4mkt") pod "56edcbe7-428c-4373-928d-b2fdf97a0a3a" (UID: "56edcbe7-428c-4373-928d-b2fdf97a0a3a"). InnerVolumeSpecName "kube-api-access-h4mkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.756564 5036 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56edcbe7-428c-4373-928d-b2fdf97a0a3a-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.756606 5036 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56edcbe7-428c-4373-928d-b2fdf97a0a3a-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.756618 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4mkt\" (UniqueName: \"kubernetes.io/projected/56edcbe7-428c-4373-928d-b2fdf97a0a3a-kube-api-access-h4mkt\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.756629 5036 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56edcbe7-428c-4373-928d-b2fdf97a0a3a-ready\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.832212 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.885168 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 10 16:29:26 crc kubenswrapper[5036]: W0110 16:29:26.945543 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd96da61f_255c_4bef_9b4b_6c7c34379cd9.slice/crio-c1adcf93fd5a2faaf6d99a3efe28f2c806c361aaf0e6129e48c80a5e81dedf75 WatchSource:0}: Error finding container c1adcf93fd5a2faaf6d99a3efe28f2c806c361aaf0e6129e48c80a5e81dedf75: Status 404 returned error can't find the container 
with id c1adcf93fd5a2faaf6d99a3efe28f2c806c361aaf0e6129e48c80a5e81dedf75 Jan 10 16:29:26 crc kubenswrapper[5036]: I0110 16:29:26.946957 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.155935 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 10 16:29:27 crc kubenswrapper[5036]: E0110 16:29:27.156544 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.156558 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.156667 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" containerName="kube-multus-additional-cni-plugins" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.157915 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.169274 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.221528 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" event={"ID":"d96da61f-255c-4bef-9b4b-6c7c34379cd9","Type":"ContainerStarted","Data":"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.221585 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" event={"ID":"d96da61f-255c-4bef-9b4b-6c7c34379cd9","Type":"ContainerStarted","Data":"c1adcf93fd5a2faaf6d99a3efe28f2c806c361aaf0e6129e48c80a5e81dedf75"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.221733 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" containerName="route-controller-manager" containerID="cri-o://c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26" gracePeriod=30 Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.222654 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.234070 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-887496984-fshvp" event={"ID":"a96d8194-525f-4ac5-98d4-bd04940a23e3","Type":"ContainerStarted","Data":"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.234124 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-887496984-fshvp" event={"ID":"a96d8194-525f-4ac5-98d4-bd04940a23e3","Type":"ContainerStarted","Data":"28c0d3f1ccf17d8bd352126f513cf01e1e5e8713b73ab8412f512ed30fe34770"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.234576 5036 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-887496984-fshvp" podUID="a96d8194-525f-4ac5-98d4-bd04940a23e3" containerName="controller-manager" containerID="cri-o://40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021" gracePeriod=30 Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.235475 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.240173 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lt5rc_56edcbe7-428c-4373-928d-b2fdf97a0a3a/kube-multus-additional-cni-plugins/0.log" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.240374 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" event={"ID":"56edcbe7-428c-4373-928d-b2fdf97a0a3a","Type":"ContainerDied","Data":"deb8a8d82bea9e19965b37e9d81eb86e91935c109a6478df9ca8bff7ff21cd6d"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.240516 5036 scope.go:117] "RemoveContainer" containerID="a75171775087a41ce797603dbbc0b45bf1b99fa4fba458edee24dd3c38070f62" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.240782 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lt5rc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.255964 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b","Type":"ContainerStarted","Data":"eeb29231db70a51aa59c07b17ce54dbe5a15bc439f566873147e38bfd529856f"} Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.256220 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" podStartSLOduration=32.256203386 podStartE2EDuration="32.256203386s" podCreationTimestamp="2026-01-10 16:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:27.251985765 +0000 UTC m=+89.122221269" watchObservedRunningTime="2026-01-10 16:29:27.256203386 +0000 UTC m=+89.126438890" Jan 10 16:29:27 crc kubenswrapper[5036]: E0110 16:29:27.261270 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lsfx8" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" Jan 10 16:29:27 crc kubenswrapper[5036]: E0110 16:29:27.261674 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8qz7l" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" Jan 10 16:29:27 crc kubenswrapper[5036]: E0110 16:29:27.261736 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hm8ns" 
podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" Jan 10 16:29:27 crc kubenswrapper[5036]: E0110 16:29:27.261770 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vzvbk" podUID="1efe898b-dc49-41f9-a296-84f826548896" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.264006 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.264044 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.264067 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.272384 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.302491 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-887496984-fshvp" podStartSLOduration=32.302471853 podStartE2EDuration="32.302471853s" podCreationTimestamp="2026-01-10 16:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:27.289255054 +0000 UTC m=+89.159490558" watchObservedRunningTime="2026-01-10 16:29:27.302471853 +0000 UTC m=+89.172707347" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.306213 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lt5rc"] Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.312016 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lt5rc"] Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.365765 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.365860 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: 
I0110 16:29:27.365886 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.367182 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.367694 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.389704 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access\") pod \"installer-9-crc\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.498026 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.620448 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.670225 5036 patch_prober.go:28] interesting pod/route-controller-manager-67b98d4d88-2z2l2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:45384->10.217.0.54:8443: read: connection reset by peer" start-of-body= Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.670293 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:45384->10.217.0.54:8443: read: connection reset by peer" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.708156 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.771284 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw6xc\" (UniqueName: \"kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc\") pod \"a96d8194-525f-4ac5-98d4-bd04940a23e3\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.771335 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles\") pod \"a96d8194-525f-4ac5-98d4-bd04940a23e3\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " Jan 
10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.771375 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert\") pod \"a96d8194-525f-4ac5-98d4-bd04940a23e3\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.771410 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca\") pod \"a96d8194-525f-4ac5-98d4-bd04940a23e3\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.771477 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config\") pod \"a96d8194-525f-4ac5-98d4-bd04940a23e3\" (UID: \"a96d8194-525f-4ac5-98d4-bd04940a23e3\") " Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.772431 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a96d8194-525f-4ac5-98d4-bd04940a23e3" (UID: "a96d8194-525f-4ac5-98d4-bd04940a23e3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.772479 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a96d8194-525f-4ac5-98d4-bd04940a23e3" (UID: "a96d8194-525f-4ac5-98d4-bd04940a23e3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.772524 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config" (OuterVolumeSpecName: "config") pod "a96d8194-525f-4ac5-98d4-bd04940a23e3" (UID: "a96d8194-525f-4ac5-98d4-bd04940a23e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.777707 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a96d8194-525f-4ac5-98d4-bd04940a23e3" (UID: "a96d8194-525f-4ac5-98d4-bd04940a23e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.777975 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc" (OuterVolumeSpecName: "kube-api-access-kw6xc") pod "a96d8194-525f-4ac5-98d4-bd04940a23e3" (UID: "a96d8194-525f-4ac5-98d4-bd04940a23e3"). InnerVolumeSpecName "kube-api-access-kw6xc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.873325 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw6xc\" (UniqueName: \"kubernetes.io/projected/a96d8194-525f-4ac5-98d4-bd04940a23e3-kube-api-access-kw6xc\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.873361 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.873373 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a96d8194-525f-4ac5-98d4-bd04940a23e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.873384 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.873396 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a96d8194-525f-4ac5-98d4-bd04940a23e3-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.954663 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-67b98d4d88-2z2l2_d96da61f-255c-4bef-9b4b-6c7c34379cd9/route-controller-manager/0.log" Jan 10 16:29:27 crc kubenswrapper[5036]: I0110 16:29:27.954747 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.075919 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca\") pod \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.076318 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2tl\" (UniqueName: \"kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl\") pod \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.076391 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert\") pod \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.076451 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config\") pod \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\" (UID: \"d96da61f-255c-4bef-9b4b-6c7c34379cd9\") " Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.076670 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"d96da61f-255c-4bef-9b4b-6c7c34379cd9" (UID: "d96da61f-255c-4bef-9b4b-6c7c34379cd9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.077263 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config" (OuterVolumeSpecName: "config") pod "d96da61f-255c-4bef-9b4b-6c7c34379cd9" (UID: "d96da61f-255c-4bef-9b4b-6c7c34379cd9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.080747 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d96da61f-255c-4bef-9b4b-6c7c34379cd9" (UID: "d96da61f-255c-4bef-9b4b-6c7c34379cd9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.080828 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl" (OuterVolumeSpecName: "kube-api-access-pm2tl") pod "d96da61f-255c-4bef-9b4b-6c7c34379cd9" (UID: "d96da61f-255c-4bef-9b4b-6c7c34379cd9"). InnerVolumeSpecName "kube-api-access-pm2tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.177948 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.177989 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d96da61f-255c-4bef-9b4b-6c7c34379cd9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.178005 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2tl\" (UniqueName: \"kubernetes.io/projected/d96da61f-255c-4bef-9b4b-6c7c34379cd9-kube-api-access-pm2tl\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.178017 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d96da61f-255c-4bef-9b4b-6c7c34379cd9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263229 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-67b98d4d88-2z2l2_d96da61f-255c-4bef-9b4b-6c7c34379cd9/route-controller-manager/0.log" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263535 5036 generic.go:334] "Generic (PLEG): container finished" podID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" containerID="c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26" exitCode=255 Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263636 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263583 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" event={"ID":"d96da61f-255c-4bef-9b4b-6c7c34379cd9","Type":"ContainerDied","Data":"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263795 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2" event={"ID":"d96da61f-255c-4bef-9b4b-6c7c34379cd9","Type":"ContainerDied","Data":"c1adcf93fd5a2faaf6d99a3efe28f2c806c361aaf0e6129e48c80a5e81dedf75"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.263820 5036 scope.go:117] "RemoveContainer" containerID="c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.267424 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5896d91a-6760-4d04-86ac-c45a6da0fa45","Type":"ContainerStarted","Data":"423c07e10c4a142c22bf8e51ebcbbd4b2c850c19c60037e6877ee00e43f730b0"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.267469 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5896d91a-6760-4d04-86ac-c45a6da0fa45","Type":"ContainerStarted","Data":"3f244f611ae4ffa4603009bc2fd908f1522e7c36f47cec9fc37ff3d3fadf2b8b"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.269663 5036 generic.go:334] "Generic (PLEG): container finished" podID="a96d8194-525f-4ac5-98d4-bd04940a23e3" containerID="40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021" exitCode=0 Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.269732 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-887496984-fshvp" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.269729 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-887496984-fshvp" event={"ID":"a96d8194-525f-4ac5-98d4-bd04940a23e3","Type":"ContainerDied","Data":"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.269789 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-887496984-fshvp" event={"ID":"a96d8194-525f-4ac5-98d4-bd04940a23e3","Type":"ContainerDied","Data":"28c0d3f1ccf17d8bd352126f513cf01e1e5e8713b73ab8412f512ed30fe34770"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.274536 5036 generic.go:334] "Generic (PLEG): container finished" podID="20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" containerID="32246a4895762321a56e6da2df3eef3abadddabf88e82aa3787894da4f89feff" exitCode=0 Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.274754 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b","Type":"ContainerDied","Data":"32246a4895762321a56e6da2df3eef3abadddabf88e82aa3787894da4f89feff"} Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.282553 5036 scope.go:117] "RemoveContainer" containerID="c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26" Jan 10 16:29:28 crc kubenswrapper[5036]: E0110 16:29:28.283019 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26\": container with ID starting with c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26 not found: ID does not exist" containerID="c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.283061 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26"} err="failed to get container status \"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26\": rpc error: code = NotFound desc = could not find container \"c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26\": container with ID starting with c6cc19b055fff3fc65b8f0e608e340d08f821eee241b1ebab7a81780a04c4a26 not found: ID does not exist" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.283112 5036 scope.go:117] "RemoveContainer" containerID="40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.296380 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.296336434 podStartE2EDuration="1.296336434s" podCreationTimestamp="2026-01-10 16:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:28.28992528 +0000 UTC m=+90.160160784" watchObservedRunningTime="2026-01-10 16:29:28.296336434 +0000 UTC m=+90.166571938" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.323259 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:28 crc 
kubenswrapper[5036]: I0110 16:29:28.323618 5036 scope.go:117] "RemoveContainer" containerID="40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021" Jan 10 16:29:28 crc kubenswrapper[5036]: E0110 16:29:28.324564 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021\": container with ID starting with 40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021 not found: ID does not exist" containerID="40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.324612 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021"} err="failed to get container status \"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021\": rpc error: code = NotFound desc = could not find container \"40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021\": container with ID starting with 40856f0a54202bae9f97c7c8fc907b83bead7a3bde8258f2a65c8c2b5ebf9021 not found: ID does not exist" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.327298 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67b98d4d88-2z2l2"] Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.336763 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.339370 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-887496984-fshvp"] Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.517634 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56edcbe7-428c-4373-928d-b2fdf97a0a3a" path="/var/lib/kubelet/pods/56edcbe7-428c-4373-928d-b2fdf97a0a3a/volumes" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.519128 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a96d8194-525f-4ac5-98d4-bd04940a23e3" path="/var/lib/kubelet/pods/a96d8194-525f-4ac5-98d4-bd04940a23e3/volumes" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.519776 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" path="/var/lib/kubelet/pods/d96da61f-255c-4bef-9b4b-6c7c34379cd9/volumes" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.794879 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:28 crc kubenswrapper[5036]: E0110 16:29:28.796372 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a96d8194-525f-4ac5-98d4-bd04940a23e3" containerName="controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.796396 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a96d8194-525f-4ac5-98d4-bd04940a23e3" containerName="controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: E0110 16:29:28.796421 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" containerName="route-controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.796430 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" 
containerName="route-controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.796564 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96da61f-255c-4bef-9b4b-6c7c34379cd9" containerName="route-controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.796579 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a96d8194-525f-4ac5-98d4-bd04940a23e3" containerName="controller-manager" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.797088 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.798470 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.799173 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.800198 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.800652 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.800881 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.801606 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.801882 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.802046 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.802199 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.802336 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.804323 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.804679 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.805248 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.805484 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.810134 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:28 crc 
kubenswrapper[5036]: I0110 16:29:28.810853 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.814383 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889161 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889216 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889253 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7v9x\" (UniqueName: \"kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889289 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889319 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7lnh\" (UniqueName: \"kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889573 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889664 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889760 5036 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.889812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.991065 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.991495 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7lnh\" (UniqueName: \"kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.991620 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.991784 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.991900 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.992019 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.992138 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.992222 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.992337 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7v9x\" (UniqueName: \"kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.992999 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.993028 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.993539 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.994044 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:28 crc kubenswrapper[5036]: I0110 16:29:28.994053 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.000127 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " 
pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.001093 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.012057 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7v9x\" (UniqueName: \"kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x\") pod \"route-controller-manager-9d85fcf98-226m8\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.020401 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7lnh\" (UniqueName: \"kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh\") pod \"controller-manager-c5d4c95dc-zjqgw\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.127046 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.138113 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.380109 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:29 crc kubenswrapper[5036]: W0110 16:29:29.388435 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4428fbb6_3ebd_48cf_8186_002053b880cd.slice/crio-53786066f6b85032c430dbb79d73abbb66d80bcfc282425612eda0c6e4597f14 WatchSource:0}: Error finding container 53786066f6b85032c430dbb79d73abbb66d80bcfc282425612eda0c6e4597f14: Status 404 returned error can't find the container with id 53786066f6b85032c430dbb79d73abbb66d80bcfc282425612eda0c6e4597f14 Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.544533 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.651113 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.702835 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access\") pod \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.702935 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir\") pod \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\" (UID: \"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b\") " Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.703341 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" (UID: "20042ba4-9f8c-48b8-85e4-97d8ea2ad51b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.710439 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" (UID: "20042ba4-9f8c-48b8-85e4-97d8ea2ad51b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.804775 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:29 crc kubenswrapper[5036]: I0110 16:29:29.804819 5036 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20042ba4-9f8c-48b8-85e4-97d8ea2ad51b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.298022 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" event={"ID":"beab962f-cd17-4571-be06-3df435921898","Type":"ContainerStarted","Data":"fa4021f1842d2e90cf9e790f772bba3684246177d018b3f5c25541e58421f4e3"} Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.298588 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" event={"ID":"beab962f-cd17-4571-be06-3df435921898","Type":"ContainerStarted","Data":"306788ae69f5510c532c6526fbb05e398cc4527fa7227ec845f1313f64bd19cc"} Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.300621 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.302591 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.302593 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"20042ba4-9f8c-48b8-85e4-97d8ea2ad51b","Type":"ContainerDied","Data":"eeb29231db70a51aa59c07b17ce54dbe5a15bc439f566873147e38bfd529856f"} Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.302931 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeb29231db70a51aa59c07b17ce54dbe5a15bc439f566873147e38bfd529856f" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.307124 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" event={"ID":"4428fbb6-3ebd-48cf-8186-002053b880cd","Type":"ContainerStarted","Data":"9f4708e992a7b293e5414dfc878f14ef1d02787c7f71a6108911e0300f5aa842"} Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.307173 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" event={"ID":"4428fbb6-3ebd-48cf-8186-002053b880cd","Type":"ContainerStarted","Data":"53786066f6b85032c430dbb79d73abbb66d80bcfc282425612eda0c6e4597f14"} Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.307660 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.322592 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" podStartSLOduration=15.32256883 podStartE2EDuration="15.32256883s" podCreationTimestamp="2026-01-10 16:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:30.321387486 +0000 UTC m=+92.191623070" watchObservedRunningTime="2026-01-10 16:29:30.32256883 +0000 UTC m=+92.192804364" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.329843 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.350861 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" podStartSLOduration=15.350751708 podStartE2EDuration="15.350751708s" podCreationTimestamp="2026-01-10 16:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:29:30.341971066 +0000 UTC m=+92.212206570" watchObservedRunningTime="2026-01-10 16:29:30.350751708 +0000 UTC m=+92.220987212" Jan 10 16:29:30 crc kubenswrapper[5036]: I0110 16:29:30.376518 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:36 crc kubenswrapper[5036]: I0110 16:29:36.342471 5036 generic.go:334] "Generic (PLEG): container finished" podID="0239b380-03c8-455e-a981-2aaaae000828" containerID="809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db" exitCode=0 Jan 10 16:29:36 crc kubenswrapper[5036]: I0110 16:29:36.342518 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" 
event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerDied","Data":"809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db"} Jan 10 16:29:37 crc kubenswrapper[5036]: I0110 16:29:37.352174 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerStarted","Data":"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0"} Jan 10 16:29:37 crc kubenswrapper[5036]: I0110 16:29:37.374856 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w55c" podStartSLOduration=2.8954856209999997 podStartE2EDuration="56.374837008s" podCreationTimestamp="2026-01-10 16:28:41 +0000 UTC" firstStartedPulling="2026-01-10 16:28:43.260484034 +0000 UTC m=+45.130719518" lastFinishedPulling="2026-01-10 16:29:36.739835411 +0000 UTC m=+98.610070905" observedRunningTime="2026-01-10 16:29:37.372622304 +0000 UTC m=+99.242857818" watchObservedRunningTime="2026-01-10 16:29:37.374837008 +0000 UTC m=+99.245072492" Jan 10 16:29:38 crc kubenswrapper[5036]: I0110 16:29:38.360067 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerStarted","Data":"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2"} Jan 10 16:29:39 crc kubenswrapper[5036]: I0110 16:29:39.367876 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerStarted","Data":"5b33f13f3be25151c0fb0bf24c0ffe45edd5991b4fc54f8feca321adcd55d48c"} Jan 10 16:29:39 crc kubenswrapper[5036]: I0110 16:29:39.371852 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerStarted","Data":"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8"} Jan 10 16:29:39 crc kubenswrapper[5036]: I0110 16:29:39.375164 5036 generic.go:334] "Generic (PLEG): container finished" podID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerID="e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2" exitCode=0 Jan 10 16:29:39 crc kubenswrapper[5036]: I0110 16:29:39.375204 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerDied","Data":"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2"} Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.385526 5036 generic.go:334] "Generic (PLEG): container finished" podID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerID="7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6" exitCode=0 Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.385621 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerDied","Data":"7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6"} Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.393769 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerStarted","Data":"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f"} 
Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.402446 5036 generic.go:334] "Generic (PLEG): container finished" podID="ea0d5867-9889-49bd-b23e-545606295a7a" containerID="5b33f13f3be25151c0fb0bf24c0ffe45edd5991b4fc54f8feca321adcd55d48c" exitCode=0 Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.402539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerDied","Data":"5b33f13f3be25151c0fb0bf24c0ffe45edd5991b4fc54f8feca321adcd55d48c"} Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.407334 5036 generic.go:334] "Generic (PLEG): container finished" podID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerID="0e37436c1a1dc0f3042a894790da2fdc775a75f4694f9a223bd66705190174ca" exitCode=0 Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.407403 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerDied","Data":"0e37436c1a1dc0f3042a894790da2fdc775a75f4694f9a223bd66705190174ca"} Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.420495 5036 generic.go:334] "Generic (PLEG): container finished" podID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerID="160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8" exitCode=0 Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.420563 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerDied","Data":"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8"} Jan 10 16:29:40 crc kubenswrapper[5036]: I0110 16:29:40.476057 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v2wzb" podStartSLOduration=2.894778239 podStartE2EDuration="57.476033131s" podCreationTimestamp="2026-01-10 16:28:43 +0000 UTC" firstStartedPulling="2026-01-10 16:28:45.371286032 +0000 UTC m=+47.241521526" lastFinishedPulling="2026-01-10 16:29:39.952540924 +0000 UTC m=+101.822776418" observedRunningTime="2026-01-10 16:29:40.45089307 +0000 UTC m=+102.321128574" watchObservedRunningTime="2026-01-10 16:29:40.476033131 +0000 UTC m=+102.346268635" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.427701 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerStarted","Data":"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0"} Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.438539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerStarted","Data":"6404654d6ee045b6e3428d393ec1a4f40dde0bdf72872464398c4fffb444434c"} Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.441045 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerStarted","Data":"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227"} Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.443239 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" 
event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerStarted","Data":"fd8a828ed95a2eef22cf778e07a1bdb83f9f42736c0b194a382f560981080587"} Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.456836 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hm8ns" podStartSLOduration=2.828776183 podStartE2EDuration="1m0.456821387s" podCreationTimestamp="2026-01-10 16:28:41 +0000 UTC" firstStartedPulling="2026-01-10 16:28:43.24998932 +0000 UTC m=+45.120224814" lastFinishedPulling="2026-01-10 16:29:40.878034524 +0000 UTC m=+102.748270018" observedRunningTime="2026-01-10 16:29:41.452110402 +0000 UTC m=+103.322345896" watchObservedRunningTime="2026-01-10 16:29:41.456821387 +0000 UTC m=+103.327056881" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.473492 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lsfx8" podStartSLOduration=2.878340382 podStartE2EDuration="1m0.473474235s" podCreationTimestamp="2026-01-10 16:28:41 +0000 UTC" firstStartedPulling="2026-01-10 16:28:43.270636729 +0000 UTC m=+45.140872223" lastFinishedPulling="2026-01-10 16:29:40.865770582 +0000 UTC m=+102.736006076" observedRunningTime="2026-01-10 16:29:41.471009914 +0000 UTC m=+103.341245408" watchObservedRunningTime="2026-01-10 16:29:41.473474235 +0000 UTC m=+103.343709729" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.507126 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-czknz" podStartSLOduration=2.947625854 podStartE2EDuration="1m0.50709128s" podCreationTimestamp="2026-01-10 16:28:41 +0000 UTC" firstStartedPulling="2026-01-10 16:28:43.249872237 +0000 UTC m=+45.120107731" lastFinishedPulling="2026-01-10 16:29:40.809337663 +0000 UTC m=+102.679573157" observedRunningTime="2026-01-10 16:29:41.500566562 +0000 UTC m=+103.370802056" watchObservedRunningTime="2026-01-10 16:29:41.50709128 +0000 UTC m=+103.377326774" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.522986 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q72r7" podStartSLOduration=3.061470047 podStartE2EDuration="57.522965245s" podCreationTimestamp="2026-01-10 16:28:44 +0000 UTC" firstStartedPulling="2026-01-10 16:28:46.46828049 +0000 UTC m=+48.338515974" lastFinishedPulling="2026-01-10 16:29:40.929775678 +0000 UTC m=+102.800011172" observedRunningTime="2026-01-10 16:29:41.5199961 +0000 UTC m=+103.390231594" watchObservedRunningTime="2026-01-10 16:29:41.522965245 +0000 UTC m=+103.393200739" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.655022 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.655084 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.732248 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.732311 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.779247 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.870020 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:41 crc kubenswrapper[5036]: I0110 16:29:41.870237 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:42 crc kubenswrapper[5036]: I0110 16:29:42.057494 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:42 crc kubenswrapper[5036]: I0110 16:29:42.057547 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:42 crc kubenswrapper[5036]: I0110 16:29:42.520246 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:29:42 crc kubenswrapper[5036]: I0110 16:29:42.755819 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hm8ns" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="registry-server" probeResult="failure" output=< Jan 10 16:29:42 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:29:42 crc kubenswrapper[5036]: > Jan 10 16:29:42 crc kubenswrapper[5036]: I0110 16:29:42.923299 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-czknz" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="registry-server" probeResult="failure" output=< Jan 10 16:29:42 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:29:42 crc kubenswrapper[5036]: > Jan 10 16:29:43 crc kubenswrapper[5036]: I0110 16:29:43.099564 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lsfx8" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="registry-server" probeResult="failure" output=< Jan 10 16:29:43 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:29:43 crc kubenswrapper[5036]: > Jan 10 16:29:43 crc kubenswrapper[5036]: I0110 16:29:43.456983 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerStarted","Data":"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6"} Jan 10 16:29:43 crc kubenswrapper[5036]: I0110 16:29:43.458923 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerStarted","Data":"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d"} Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.018094 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.018213 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.068021 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.466652 
5036 generic.go:334] "Generic (PLEG): container finished" podID="1efe898b-dc49-41f9-a296-84f826548896" containerID="7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6" exitCode=0 Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.467381 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerDied","Data":"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6"} Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.626718 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:29:44 crc kubenswrapper[5036]: I0110 16:29:44.626793 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:29:45 crc kubenswrapper[5036]: I0110 16:29:45.475956 5036 generic.go:334] "Generic (PLEG): container finished" podID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerID="f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d" exitCode=0 Jan 10 16:29:45 crc kubenswrapper[5036]: I0110 16:29:45.476020 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerDied","Data":"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d"} Jan 10 16:29:45 crc kubenswrapper[5036]: I0110 16:29:45.526497 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:45 crc kubenswrapper[5036]: I0110 16:29:45.680536 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q72r7" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="registry-server" probeResult="failure" output=< Jan 10 16:29:45 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:29:45 crc kubenswrapper[5036]: > Jan 10 16:29:46 crc kubenswrapper[5036]: I0110 16:29:46.484356 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerStarted","Data":"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93"} Jan 10 16:29:46 crc kubenswrapper[5036]: I0110 16:29:46.486850 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerStarted","Data":"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8"} Jan 10 16:29:46 crc kubenswrapper[5036]: I0110 16:29:46.509128 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8qz7l" podStartSLOduration=3.072481634 podStartE2EDuration="1m2.509106253s" podCreationTimestamp="2026-01-10 16:28:44 +0000 UTC" firstStartedPulling="2026-01-10 16:28:46.467351114 +0000 UTC m=+48.337586608" lastFinishedPulling="2026-01-10 16:29:45.903975733 +0000 UTC m=+107.774211227" observedRunningTime="2026-01-10 16:29:46.505789197 +0000 UTC m=+108.376024711" watchObservedRunningTime="2026-01-10 16:29:46.509106253 +0000 UTC m=+108.379341757" Jan 10 16:29:46 crc kubenswrapper[5036]: I0110 16:29:46.526336 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vzvbk" 
podStartSLOduration=2.102952371 podStartE2EDuration="1m3.526292976s" podCreationTimestamp="2026-01-10 16:28:43 +0000 UTC" firstStartedPulling="2026-01-10 16:28:44.344550034 +0000 UTC m=+46.214785528" lastFinishedPulling="2026-01-10 16:29:45.767890629 +0000 UTC m=+107.638126133" observedRunningTime="2026-01-10 16:29:46.525872353 +0000 UTC m=+108.396107847" watchObservedRunningTime="2026-01-10 16:29:46.526292976 +0000 UTC m=+108.396528470" Jan 10 16:29:46 crc kubenswrapper[5036]: I0110 16:29:46.763510 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:29:47 crc kubenswrapper[5036]: I0110 16:29:47.491628 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v2wzb" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="registry-server" containerID="cri-o://d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f" gracePeriod=2 Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.477819 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.500105 5036 generic.go:334] "Generic (PLEG): container finished" podID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerID="d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f" exitCode=0 Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.500162 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerDied","Data":"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f"} Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.500194 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v2wzb" event={"ID":"9f0cd226-9f92-4ef2-a82b-7746983ab42e","Type":"ContainerDied","Data":"c402861909e7898d6ccb6f0f55744ee44b4d1c60b6157a126658c3077bf9bdb5"} Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.500213 5036 scope.go:117] "RemoveContainer" containerID="d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.500246 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v2wzb" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.521380 5036 scope.go:117] "RemoveContainer" containerID="e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.542923 5036 scope.go:117] "RemoveContainer" containerID="8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.563169 5036 scope.go:117] "RemoveContainer" containerID="d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f" Jan 10 16:29:48 crc kubenswrapper[5036]: E0110 16:29:48.563801 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f\": container with ID starting with d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f not found: ID does not exist" containerID="d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.563862 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f"} err="failed to get container status \"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f\": rpc error: code = NotFound desc = could not find container \"d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f\": container with ID starting with d1fd211338334113f068dfb4163ac699814a56cf8ec915cfc242fed52b1e661f not found: ID does not exist" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.563903 5036 scope.go:117] "RemoveContainer" containerID="e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2" Jan 10 16:29:48 crc kubenswrapper[5036]: E0110 16:29:48.564537 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2\": container with ID starting with e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2 not found: ID does not exist" containerID="e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.564581 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2"} err="failed to get container status \"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2\": rpc error: code = NotFound desc = could not find container \"e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2\": container with ID starting with e26107c10d9c4c743bf686de9d7e1a50a6b7f23bce877c902d1af348e25f2ef2 not found: ID does not exist" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.564615 5036 scope.go:117] "RemoveContainer" containerID="8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06" Jan 10 16:29:48 crc kubenswrapper[5036]: E0110 16:29:48.565136 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06\": container with ID starting with 8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06 not found: ID does not exist" containerID="8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06" 
Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.565196 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06"} err="failed to get container status \"8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06\": rpc error: code = NotFound desc = could not find container \"8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06\": container with ID starting with 8d1c63c1d91eab3349633de5ff5f50cd7bde0343e7f590ed1f4b9f046509dd06 not found: ID does not exist" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.626496 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content\") pod \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.626590 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grqzm\" (UniqueName: \"kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm\") pod \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.626765 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities\") pod \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\" (UID: \"9f0cd226-9f92-4ef2-a82b-7746983ab42e\") " Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.627564 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities" (OuterVolumeSpecName: "utilities") pod "9f0cd226-9f92-4ef2-a82b-7746983ab42e" (UID: "9f0cd226-9f92-4ef2-a82b-7746983ab42e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.634410 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm" (OuterVolumeSpecName: "kube-api-access-grqzm") pod "9f0cd226-9f92-4ef2-a82b-7746983ab42e" (UID: "9f0cd226-9f92-4ef2-a82b-7746983ab42e"). InnerVolumeSpecName "kube-api-access-grqzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.660110 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f0cd226-9f92-4ef2-a82b-7746983ab42e" (UID: "9f0cd226-9f92-4ef2-a82b-7746983ab42e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.728588 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.728631 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grqzm\" (UniqueName: \"kubernetes.io/projected/9f0cd226-9f92-4ef2-a82b-7746983ab42e-kube-api-access-grqzm\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.728647 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0cd226-9f92-4ef2-a82b-7746983ab42e-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.830035 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:29:48 crc kubenswrapper[5036]: I0110 16:29:48.833840 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v2wzb"] Jan 10 16:29:50 crc kubenswrapper[5036]: I0110 16:29:50.516693 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" path="/var/lib/kubelet/pods/9f0cd226-9f92-4ef2-a82b-7746983ab42e/volumes" Jan 10 16:29:51 crc kubenswrapper[5036]: I0110 16:29:51.708951 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:29:51 crc kubenswrapper[5036]: I0110 16:29:51.755444 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:29:51 crc kubenswrapper[5036]: I0110 16:29:51.919270 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:51 crc kubenswrapper[5036]: I0110 16:29:51.971651 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:52 crc kubenswrapper[5036]: I0110 16:29:52.116937 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:52 crc kubenswrapper[5036]: I0110 16:29:52.167529 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:53 crc kubenswrapper[5036]: I0110 16:29:53.647539 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:29:53 crc kubenswrapper[5036]: I0110 16:29:53.647629 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:29:53 crc kubenswrapper[5036]: I0110 16:29:53.689725 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:29:53 crc kubenswrapper[5036]: I0110 16:29:53.961004 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:29:53 crc kubenswrapper[5036]: I0110 16:29:53.961297 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lsfx8" 
podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="registry-server" containerID="cri-o://fd8a828ed95a2eef22cf778e07a1bdb83f9f42736c0b194a382f560981080587" gracePeriod=2 Jan 10 16:29:54 crc kubenswrapper[5036]: I0110 16:29:54.157113 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:29:54 crc kubenswrapper[5036]: I0110 16:29:54.157411 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-czknz" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="registry-server" containerID="cri-o://6404654d6ee045b6e3428d393ec1a4f40dde0bdf72872464398c4fffb444434c" gracePeriod=2 Jan 10 16:29:54 crc kubenswrapper[5036]: I0110 16:29:54.601996 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:29:54 crc kubenswrapper[5036]: I0110 16:29:54.670031 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:29:54 crc kubenswrapper[5036]: I0110 16:29:54.712581 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.017930 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.017996 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.091743 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.556404 5036 generic.go:334] "Generic (PLEG): container finished" podID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerID="6404654d6ee045b6e3428d393ec1a4f40dde0bdf72872464398c4fffb444434c" exitCode=0 Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.556523 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerDied","Data":"6404654d6ee045b6e3428d393ec1a4f40dde0bdf72872464398c4fffb444434c"} Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.559251 5036 generic.go:334] "Generic (PLEG): container finished" podID="ea0d5867-9889-49bd-b23e-545606295a7a" containerID="fd8a828ed95a2eef22cf778e07a1bdb83f9f42736c0b194a382f560981080587" exitCode=0 Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.559452 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerDied","Data":"fd8a828ed95a2eef22cf778e07a1bdb83f9f42736c0b194a382f560981080587"} Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.574984 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.575773 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" podUID="4428fbb6-3ebd-48cf-8186-002053b880cd" containerName="controller-manager" 
containerID="cri-o://9f4708e992a7b293e5414dfc878f14ef1d02787c7f71a6108911e0300f5aa842" gracePeriod=30 Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.614021 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.671413 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:55 crc kubenswrapper[5036]: I0110 16:29:55.671663 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" podUID="beab962f-cd17-4571-be06-3df435921898" containerName="route-controller-manager" containerID="cri-o://fa4021f1842d2e90cf9e790f772bba3684246177d018b3f5c25541e58421f4e3" gracePeriod=30 Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.521791 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.525095 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.574468 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsfx8" event={"ID":"ea0d5867-9889-49bd-b23e-545606295a7a","Type":"ContainerDied","Data":"7eb1372bb4f38659f5b9cce51f162d06cf5d3b2dd27d69562020d5bbc6f8a7b9"} Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.574535 5036 scope.go:117] "RemoveContainer" containerID="fd8a828ed95a2eef22cf778e07a1bdb83f9f42736c0b194a382f560981080587" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.574580 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsfx8" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.577469 5036 generic.go:334] "Generic (PLEG): container finished" podID="4428fbb6-3ebd-48cf-8186-002053b880cd" containerID="9f4708e992a7b293e5414dfc878f14ef1d02787c7f71a6108911e0300f5aa842" exitCode=0 Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.577613 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" event={"ID":"4428fbb6-3ebd-48cf-8186-002053b880cd","Type":"ContainerDied","Data":"9f4708e992a7b293e5414dfc878f14ef1d02787c7f71a6108911e0300f5aa842"} Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.581364 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-czknz" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.581457 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-czknz" event={"ID":"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9","Type":"ContainerDied","Data":"dcdc83151b5c40ecda1d0738d7370b3ab6507acf76d50edd224d77db33ac104f"} Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.583846 5036 generic.go:334] "Generic (PLEG): container finished" podID="beab962f-cd17-4571-be06-3df435921898" containerID="fa4021f1842d2e90cf9e790f772bba3684246177d018b3f5c25541e58421f4e3" exitCode=0 Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.584162 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" event={"ID":"beab962f-cd17-4571-be06-3df435921898","Type":"ContainerDied","Data":"fa4021f1842d2e90cf9e790f772bba3684246177d018b3f5c25541e58421f4e3"} Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.604349 5036 scope.go:117] "RemoveContainer" containerID="5b33f13f3be25151c0fb0bf24c0ffe45edd5991b4fc54f8feca321adcd55d48c" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.628221 5036 scope.go:117] "RemoveContainer" containerID="0d3fa54785d71a6ebcdc054c6f935cb5361813444368855e0226cec9ed56733c" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.643871 5036 scope.go:117] "RemoveContainer" containerID="6404654d6ee045b6e3428d393ec1a4f40dde0bdf72872464398c4fffb444434c" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.658255 5036 scope.go:117] "RemoveContainer" containerID="0e37436c1a1dc0f3042a894790da2fdc775a75f4694f9a223bd66705190174ca" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673194 5036 scope.go:117] "RemoveContainer" containerID="ee4c415993b049cf1be4535dff56f2ffb5448baa6ec368dc6f8cac7022f6a89a" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673606 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvk7n\" (UniqueName: \"kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n\") pod \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673728 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities\") pod \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673756 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities\") pod \"ea0d5867-9889-49bd-b23e-545606295a7a\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673779 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content\") pod \"ea0d5867-9889-49bd-b23e-545606295a7a\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673808 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq8hl\" (UniqueName: 
\"kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl\") pod \"ea0d5867-9889-49bd-b23e-545606295a7a\" (UID: \"ea0d5867-9889-49bd-b23e-545606295a7a\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.673884 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content\") pod \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\" (UID: \"3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.674811 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities" (OuterVolumeSpecName: "utilities") pod "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" (UID: "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.675267 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities" (OuterVolumeSpecName: "utilities") pod "ea0d5867-9889-49bd-b23e-545606295a7a" (UID: "ea0d5867-9889-49bd-b23e-545606295a7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.682506 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl" (OuterVolumeSpecName: "kube-api-access-wq8hl") pod "ea0d5867-9889-49bd-b23e-545606295a7a" (UID: "ea0d5867-9889-49bd-b23e-545606295a7a"). InnerVolumeSpecName "kube-api-access-wq8hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.690908 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n" (OuterVolumeSpecName: "kube-api-access-vvk7n") pod "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" (UID: "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9"). InnerVolumeSpecName "kube-api-access-vvk7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.725214 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea0d5867-9889-49bd-b23e-545606295a7a" (UID: "ea0d5867-9889-49bd-b23e-545606295a7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.729084 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" (UID: "3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.776246 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvk7n\" (UniqueName: \"kubernetes.io/projected/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-kube-api-access-vvk7n\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.776654 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.776759 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.776828 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0d5867-9889-49bd-b23e-545606295a7a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.776932 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq8hl\" (UniqueName: \"kubernetes.io/projected/ea0d5867-9889-49bd-b23e-545606295a7a-kube-api-access-wq8hl\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.777023 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.824558 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.878582 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca\") pod \"4428fbb6-3ebd-48cf-8186-002053b880cd\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.878653 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7lnh\" (UniqueName: \"kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh\") pod \"4428fbb6-3ebd-48cf-8186-002053b880cd\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.878706 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config\") pod \"4428fbb6-3ebd-48cf-8186-002053b880cd\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.878742 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert\") pod \"4428fbb6-3ebd-48cf-8186-002053b880cd\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.878770 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles\") pod \"4428fbb6-3ebd-48cf-8186-002053b880cd\" (UID: \"4428fbb6-3ebd-48cf-8186-002053b880cd\") " Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.879604 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "4428fbb6-3ebd-48cf-8186-002053b880cd" (UID: "4428fbb6-3ebd-48cf-8186-002053b880cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.879596 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4428fbb6-3ebd-48cf-8186-002053b880cd" (UID: "4428fbb6-3ebd-48cf-8186-002053b880cd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.879731 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config" (OuterVolumeSpecName: "config") pod "4428fbb6-3ebd-48cf-8186-002053b880cd" (UID: "4428fbb6-3ebd-48cf-8186-002053b880cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.883418 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4428fbb6-3ebd-48cf-8186-002053b880cd" (UID: "4428fbb6-3ebd-48cf-8186-002053b880cd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.883712 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh" (OuterVolumeSpecName: "kube-api-access-g7lnh") pod "4428fbb6-3ebd-48cf-8186-002053b880cd" (UID: "4428fbb6-3ebd-48cf-8186-002053b880cd"). InnerVolumeSpecName "kube-api-access-g7lnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.924150 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.929271 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-czknz"] Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.942105 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.945123 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lsfx8"] Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.980596 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4428fbb6-3ebd-48cf-8186-002053b880cd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.980625 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.980635 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.980644 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7lnh\" (UniqueName: \"kubernetes.io/projected/4428fbb6-3ebd-48cf-8186-002053b880cd-kube-api-access-g7lnh\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:56 crc kubenswrapper[5036]: I0110 16:29:56.980656 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4428fbb6-3ebd-48cf-8186-002053b880cd-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.610947 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" event={"ID":"4428fbb6-3ebd-48cf-8186-002053b880cd","Type":"ContainerDied","Data":"53786066f6b85032c430dbb79d73abbb66d80bcfc282425612eda0c6e4597f14"} Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.611063 5036 scope.go:117] "RemoveContainer" containerID="9f4708e992a7b293e5414dfc878f14ef1d02787c7f71a6108911e0300f5aa842" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.611105 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.675804 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.681622 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c5d4c95dc-zjqgw"] Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.870703 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.995445 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert\") pod \"beab962f-cd17-4571-be06-3df435921898\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.995972 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7v9x\" (UniqueName: \"kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x\") pod \"beab962f-cd17-4571-be06-3df435921898\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.996007 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config\") pod \"beab962f-cd17-4571-be06-3df435921898\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.996050 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca\") pod \"beab962f-cd17-4571-be06-3df435921898\" (UID: \"beab962f-cd17-4571-be06-3df435921898\") " Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.996724 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca" (OuterVolumeSpecName: "client-ca") pod "beab962f-cd17-4571-be06-3df435921898" (UID: "beab962f-cd17-4571-be06-3df435921898"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.996875 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config" (OuterVolumeSpecName: "config") pod "beab962f-cd17-4571-be06-3df435921898" (UID: "beab962f-cd17-4571-be06-3df435921898"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.998926 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x" (OuterVolumeSpecName: "kube-api-access-d7v9x") pod "beab962f-cd17-4571-be06-3df435921898" (UID: "beab962f-cd17-4571-be06-3df435921898"). InnerVolumeSpecName "kube-api-access-d7v9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:29:57 crc kubenswrapper[5036]: I0110 16:29:57.999276 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "beab962f-cd17-4571-be06-3df435921898" (UID: "beab962f-cd17-4571-be06-3df435921898"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.096728 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.096779 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/beab962f-cd17-4571-be06-3df435921898-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.096793 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7v9x\" (UniqueName: \"kubernetes.io/projected/beab962f-cd17-4571-be06-3df435921898-kube-api-access-d7v9x\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.096808 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/beab962f-cd17-4571-be06-3df435921898-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.515859 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" path="/var/lib/kubelet/pods/3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9/volumes" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.517398 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4428fbb6-3ebd-48cf-8186-002053b880cd" path="/var/lib/kubelet/pods/4428fbb6-3ebd-48cf-8186-002053b880cd/volumes" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.517984 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" path="/var/lib/kubelet/pods/ea0d5867-9889-49bd-b23e-545606295a7a/volumes" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.620564 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" event={"ID":"beab962f-cd17-4571-be06-3df435921898","Type":"ContainerDied","Data":"306788ae69f5510c532c6526fbb05e398cc4527fa7227ec845f1313f64bd19cc"} Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.620631 5036 scope.go:117] "RemoveContainer" containerID="fa4021f1842d2e90cf9e790f772bba3684246177d018b3f5c25541e58421f4e3" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.620755 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.655381 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.663527 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9d85fcf98-226m8"] Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813279 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813540 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813552 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813560 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" containerName="pruner" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813567 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" containerName="pruner" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813575 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813581 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813592 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813598 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="extract-content" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813604 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813610 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813618 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beab962f-cd17-4571-be06-3df435921898" containerName="route-controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813623 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="beab962f-cd17-4571-be06-3df435921898" containerName="route-controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813632 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813637 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813645 5036 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813651 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813662 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4428fbb6-3ebd-48cf-8186-002053b880cd" containerName="controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813668 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="4428fbb6-3ebd-48cf-8186-002053b880cd" containerName="controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813674 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813695 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813704 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813710 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="extract-utilities" Jan 10 16:29:58 crc kubenswrapper[5036]: E0110 16:29:58.813716 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813722 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813817 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="20042ba4-9f8c-48b8-85e4-97d8ea2ad51b" containerName="pruner" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813825 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="4428fbb6-3ebd-48cf-8186-002053b880cd" containerName="controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813834 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0cd226-9f92-4ef2-a82b-7746983ab42e" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813842 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be77d88-86d7-4bd2-9c58-a5d0cdf5ebb9" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813849 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0d5867-9889-49bd-b23e-545606295a7a" containerName="registry-server" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.813861 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="beab962f-cd17-4571-be06-3df435921898" containerName="route-controller-manager" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.814251 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.817493 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.817629 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.818460 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.818803 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.818924 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.819491 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.828541 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.836718 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.961950 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:29:58 crc kubenswrapper[5036]: I0110 16:29:58.962467 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8qz7l" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="registry-server" containerID="cri-o://4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93" gracePeriod=2 Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.008934 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.008997 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.009115 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lrg5\" (UniqueName: \"kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.009159 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.009210 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.110440 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.110561 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.110614 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.110666 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.111831 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lrg5\" (UniqueName: \"kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.112409 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.112495 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles\") pod 
\"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.112644 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.120153 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.146601 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lrg5\" (UniqueName: \"kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5\") pod \"controller-manager-5c5dff577d-96tm8\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.440057 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.810651 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.811977 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.814188 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.814807 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.815001 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.815157 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.815476 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.815749 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.820514 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9b9\" (UniqueName: \"kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.820671 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.820894 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.820962 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.824119 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.888239 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.922360 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nd9b9\" (UniqueName: \"kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.922456 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.922494 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.922517 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.923819 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.923823 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.928046 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:29:59 crc kubenswrapper[5036]: I0110 16:29:59.975311 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9b9\" (UniqueName: \"kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9\") pod \"route-controller-manager-7d4b946969-gjjz9\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.128299 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.164650 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2"] Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.165853 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.171060 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.171378 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.186175 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2"] Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.304836 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327694 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrjbz\" (UniqueName: \"kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz\") pod \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327738 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content\") pod \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327797 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities\") pod \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\" (UID: \"963d9e81-5aca-4e34-b326-ffb47bcf98ba\") " Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327891 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327931 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t89l5\" (UniqueName: \"kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.327954 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume\") pod 
\"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.339590 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz" (OuterVolumeSpecName: "kube-api-access-mrjbz") pod "963d9e81-5aca-4e34-b326-ffb47bcf98ba" (UID: "963d9e81-5aca-4e34-b326-ffb47bcf98ba"). InnerVolumeSpecName "kube-api-access-mrjbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.341075 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities" (OuterVolumeSpecName: "utilities") pod "963d9e81-5aca-4e34-b326-ffb47bcf98ba" (UID: "963d9e81-5aca-4e34-b326-ffb47bcf98ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.431597 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.431668 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t89l5\" (UniqueName: \"kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.431719 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.431772 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrjbz\" (UniqueName: \"kubernetes.io/projected/963d9e81-5aca-4e34-b326-ffb47bcf98ba-kube-api-access-mrjbz\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.431783 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.432904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.438168 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume\") pod 
\"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.461108 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t89l5\" (UniqueName: \"kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5\") pod \"collect-profiles-29467710-wr8z2\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.475234 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:30:00 crc kubenswrapper[5036]: W0110 16:30:00.487777 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b8ad26_c462_4fed_bb9d_98bf88363c35.slice/crio-cff03df4e6249e960742ea3ea50a86047daec36f922a2fdf6dab41d1628791b6 WatchSource:0}: Error finding container cff03df4e6249e960742ea3ea50a86047daec36f922a2fdf6dab41d1628791b6: Status 404 returned error can't find the container with id cff03df4e6249e960742ea3ea50a86047daec36f922a2fdf6dab41d1628791b6 Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.495715 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "963d9e81-5aca-4e34-b326-ffb47bcf98ba" (UID: "963d9e81-5aca-4e34-b326-ffb47bcf98ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.517880 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beab962f-cd17-4571-be06-3df435921898" path="/var/lib/kubelet/pods/beab962f-cd17-4571-be06-3df435921898/volumes" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.519036 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.533009 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/963d9e81-5aca-4e34-b326-ffb47bcf98ba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.636017 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" event={"ID":"e7165c35-9b98-4950-bd64-a6cdb0454463","Type":"ContainerStarted","Data":"d4c1a31617a2e80a8e6151e973246c8b82778a5371b775a7afcd70097d2b5b00"} Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.636069 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" event={"ID":"e7165c35-9b98-4950-bd64-a6cdb0454463","Type":"ContainerStarted","Data":"61efbbbe47135e0b27729e943745d147db98a383f0a4f94b11c1233783c87265"} Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.636302 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.638436 5036 generic.go:334] "Generic (PLEG): container finished" podID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerID="4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93" exitCode=0 Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.638541 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerDied","Data":"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93"} Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.638574 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8qz7l" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.638599 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qz7l" event={"ID":"963d9e81-5aca-4e34-b326-ffb47bcf98ba","Type":"ContainerDied","Data":"51d735fac832c36544bc72aa72579549ce1053afce368e295ff8c6fc0996121e"} Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.638623 5036 scope.go:117] "RemoveContainer" containerID="4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.641289 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" event={"ID":"78b8ad26-c462-4fed-bb9d-98bf88363c35","Type":"ContainerStarted","Data":"cff03df4e6249e960742ea3ea50a86047daec36f922a2fdf6dab41d1628791b6"} Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.641603 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.658610 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" podStartSLOduration=5.658556008 podStartE2EDuration="5.658556008s" podCreationTimestamp="2026-01-10 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:30:00.654088699 +0000 UTC m=+122.524324193" watchObservedRunningTime="2026-01-10 16:30:00.658556008 +0000 UTC m=+122.528791502" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.673928 5036 scope.go:117] "RemoveContainer" containerID="f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.698015 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.713064 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8qz7l"] Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.721849 5036 scope.go:117] "RemoveContainer" containerID="6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.753973 5036 scope.go:117] "RemoveContainer" containerID="4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93" Jan 10 16:30:00 crc kubenswrapper[5036]: E0110 16:30:00.754669 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93\": container with ID starting with 4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93 not found: ID does not exist" containerID="4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.754743 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93"} err="failed to get container status \"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93\": rpc error: code = NotFound desc = could not find container \"4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93\": container with ID 
starting with 4f03b740a6ecb606caed7eea9820efeca6a80faf8df0137c40b1e60fe4117b93 not found: ID does not exist" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.754782 5036 scope.go:117] "RemoveContainer" containerID="f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d" Jan 10 16:30:00 crc kubenswrapper[5036]: E0110 16:30:00.756200 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d\": container with ID starting with f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d not found: ID does not exist" containerID="f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.756242 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d"} err="failed to get container status \"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d\": rpc error: code = NotFound desc = could not find container \"f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d\": container with ID starting with f7e9f58a707c71fa69abec8941146961993888b267ace16b72e39b15a080948d not found: ID does not exist" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.756270 5036 scope.go:117] "RemoveContainer" containerID="6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3" Jan 10 16:30:00 crc kubenswrapper[5036]: E0110 16:30:00.756899 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3\": container with ID starting with 6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3 not found: ID does not exist" containerID="6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.756938 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3"} err="failed to get container status \"6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3\": rpc error: code = NotFound desc = could not find container \"6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3\": container with ID starting with 6f2db35f8417c37b10db412b93db32bf712c3f092e328d4bb0f028c96f093ca3 not found: ID does not exist" Jan 10 16:30:00 crc kubenswrapper[5036]: I0110 16:30:00.818102 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2"] Jan 10 16:30:00 crc kubenswrapper[5036]: W0110 16:30:00.826604 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa416ec_eaf9_430f_9ada_2b4dd73c76ca.slice/crio-b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792 WatchSource:0}: Error finding container b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792: Status 404 returned error can't find the container with id b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792 Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.650173 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" 
event={"ID":"78b8ad26-c462-4fed-bb9d-98bf88363c35","Type":"ContainerStarted","Data":"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b"} Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.650700 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.651839 5036 generic.go:334] "Generic (PLEG): container finished" podID="ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" containerID="9437bee23ebc26628c7c421d7b8e6a3d87fef287bffc247d34d41ac077c8d3e2" exitCode=0 Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.652272 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" event={"ID":"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca","Type":"ContainerDied","Data":"9437bee23ebc26628c7c421d7b8e6a3d87fef287bffc247d34d41ac077c8d3e2"} Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.652303 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" event={"ID":"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca","Type":"ContainerStarted","Data":"b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792"} Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.658542 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.709556 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" podStartSLOduration=6.709538866 podStartE2EDuration="6.709538866s" podCreationTimestamp="2026-01-10 16:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:30:01.689372178 +0000 UTC m=+123.559607692" watchObservedRunningTime="2026-01-10 16:30:01.709538866 +0000 UTC m=+123.579774360" Jan 10 16:30:01 crc kubenswrapper[5036]: I0110 16:30:01.934616 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8vvs"] Jan 10 16:30:02 crc kubenswrapper[5036]: I0110 16:30:02.515939 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" path="/var/lib/kubelet/pods/963d9e81-5aca-4e34-b326-ffb47bcf98ba/volumes" Jan 10 16:30:02 crc kubenswrapper[5036]: I0110 16:30:02.926777 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.074294 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t89l5\" (UniqueName: \"kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5\") pod \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.074370 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume\") pod \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.074546 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume\") pod \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\" (UID: \"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca\") " Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.075741 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" (UID: "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.076113 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.080532 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" (UID: "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.084829 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5" (OuterVolumeSpecName: "kube-api-access-t89l5") pod "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" (UID: "ffa416ec-eaf9-430f-9ada-2b4dd73c76ca"). InnerVolumeSpecName "kube-api-access-t89l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.177312 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t89l5\" (UniqueName: \"kubernetes.io/projected/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-kube-api-access-t89l5\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.177388 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.665376 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" event={"ID":"ffa416ec-eaf9-430f-9ada-2b4dd73c76ca","Type":"ContainerDied","Data":"b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792"} Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.665439 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b27967afa21730dabbf69f0c3eb8f62eaf7126ba48286a2ea88fc9738dd38792" Jan 10 16:30:03 crc kubenswrapper[5036]: I0110 16:30:03.665585 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.793838 5036 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.794713 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515" gracePeriod=15 Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.794765 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade" gracePeriod=15 Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.794792 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb" gracePeriod=15 Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.794812 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e" gracePeriod=15 Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.799097 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764" gracePeriod=15 Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803328 5036 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.803843 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" containerName="collect-profiles" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803869 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" containerName="collect-profiles" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.803908 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803919 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.803936 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803947 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.803968 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803978 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.803986 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.803994 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804024 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804033 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804045 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804054 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804073 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="extract-utilities" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804088 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="extract-utilities" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804103 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" 
containerName="registry-server" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804112 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="registry-server" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804131 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="extract-content" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804138 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="extract-content" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.804156 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804165 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804438 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="963d9e81-5aca-4e34-b326-ffb47bcf98ba" containerName="registry-server" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804473 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" containerName="collect-profiles" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804492 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804504 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804524 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804542 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804557 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.804572 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.819460 5036 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.825997 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:05 crc kubenswrapper[5036]: I0110 16:30:05.833934 5036 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:05 crc kubenswrapper[5036]: E0110 16:30:05.855134 5036 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.83:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.018355 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.018432 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.018833 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.019187 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.019251 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.019358 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.019415 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.019449 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.105117 5036 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.105603 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120298 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120744 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120864 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120952 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121034 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121119 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 
16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121231 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121310 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120962 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120520 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121079 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.120997 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121649 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121021 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121760 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.121853 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.155871 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:06 crc kubenswrapper[5036]: W0110 16:30:06.178549 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6343f84e7e415204d3f22dcfa55206f5ed2979ccb5a1c05faf24bcc1336b961c WatchSource:0}: Error finding container 6343f84e7e415204d3f22dcfa55206f5ed2979ccb5a1c05faf24bcc1336b961c: Status 404 returned error can't find the container with id 6343f84e7e415204d3f22dcfa55206f5ed2979ccb5a1c05faf24bcc1336b961c Jan 10 16:30:06 crc kubenswrapper[5036]: E0110 16:30:06.184176 5036 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.83:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18896b929e2da256 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-10 16:30:06.18281839 +0000 UTC m=+128.053053884,LastTimestamp:2026-01-10 16:30:06.18281839 +0000 UTC m=+128.053053884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.692417 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694030 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694755 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade" exitCode=0 Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694793 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb" exitCode=0 Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694805 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764" exitCode=0 Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694816 5036 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e" exitCode=2 Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.694925 5036 scope.go:117] "RemoveContainer" containerID="a1a143c7481f264da37aeab778a53b3ba35fa1c2aa6a5111aa105283a82be44d" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.696832 5036 generic.go:334] "Generic (PLEG): container finished" podID="5896d91a-6760-4d04-86ac-c45a6da0fa45" containerID="423c07e10c4a142c22bf8e51ebcbbd4b2c850c19c60037e6877ee00e43f730b0" exitCode=0 Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.696914 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5896d91a-6760-4d04-86ac-c45a6da0fa45","Type":"ContainerDied","Data":"423c07e10c4a142c22bf8e51ebcbbd4b2c850c19c60037e6877ee00e43f730b0"} Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.697755 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.698798 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d"} Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.698834 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6343f84e7e415204d3f22dcfa55206f5ed2979ccb5a1c05faf24bcc1336b961c"} Jan 10 16:30:06 crc kubenswrapper[5036]: I0110 16:30:06.699387 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:06 crc kubenswrapper[5036]: E0110 16:30:06.699875 5036 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.83:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:07 crc kubenswrapper[5036]: I0110 16:30:07.707893 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.183187 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.184804 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.254596 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access\") pod \"5896d91a-6760-4d04-86ac-c45a6da0fa45\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.254669 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir\") pod \"5896d91a-6760-4d04-86ac-c45a6da0fa45\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.254725 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock\") pod \"5896d91a-6760-4d04-86ac-c45a6da0fa45\" (UID: \"5896d91a-6760-4d04-86ac-c45a6da0fa45\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.254821 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5896d91a-6760-4d04-86ac-c45a6da0fa45" (UID: "5896d91a-6760-4d04-86ac-c45a6da0fa45"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.254951 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock" (OuterVolumeSpecName: "var-lock") pod "5896d91a-6760-4d04-86ac-c45a6da0fa45" (UID: "5896d91a-6760-4d04-86ac-c45a6da0fa45"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.255150 5036 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.255166 5036 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5896d91a-6760-4d04-86ac-c45a6da0fa45-var-lock\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.261462 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5896d91a-6760-4d04-86ac-c45a6da0fa45" (UID: "5896d91a-6760-4d04-86ac-c45a6da0fa45"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.357019 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5896d91a-6760-4d04-86ac-c45a6da0fa45-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.511189 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.659231 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.662136 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.663268 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.663999 5036 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.726103 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5896d91a-6760-4d04-86ac-c45a6da0fa45","Type":"ContainerDied","Data":"3f244f611ae4ffa4603009bc2fd908f1522e7c36f47cec9fc37ff3d3fadf2b8b"} Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.726161 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f244f611ae4ffa4603009bc2fd908f1522e7c36f47cec9fc37ff3d3fadf2b8b" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.726209 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.731628 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.731886 5036 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.732854 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.733522 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515" exitCode=0 Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.733598 5036 scope.go:117] "RemoveContainer" containerID="48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.733954 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.755010 5036 scope.go:117] "RemoveContainer" containerID="6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763233 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763355 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763407 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763445 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763464 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763550 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763741 5036 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763768 5036 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.763782 5036 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.770071 5036 scope.go:117] "RemoveContainer" containerID="71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.790717 5036 scope.go:117] "RemoveContainer" containerID="f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.804604 5036 scope.go:117] "RemoveContainer" containerID="5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.823691 5036 scope.go:117] "RemoveContainer" containerID="3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.846917 5036 scope.go:117] "RemoveContainer" containerID="48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.847504 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\": container with ID starting with 48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade not found: ID does not exist" containerID="48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.847555 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade"} err="failed to get container status \"48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\": rpc error: code = NotFound desc = could not find container \"48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade\": container with ID starting with 48aa6d8e0f00ddf9a6fdef1b8ae1ee9ff101082f5e7d871c81beaa68344edade not found: ID does not exist" Jan 10 16:30:08 
crc kubenswrapper[5036]: I0110 16:30:08.847599 5036 scope.go:117] "RemoveContainer" containerID="6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.848262 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\": container with ID starting with 6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb not found: ID does not exist" containerID="6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.848295 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb"} err="failed to get container status \"6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\": rpc error: code = NotFound desc = could not find container \"6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb\": container with ID starting with 6164714519a51fd12d13bbf0c74e2ed910fe7e9fb5fc21b0476fa946fc54c3bb not found: ID does not exist" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.848318 5036 scope.go:117] "RemoveContainer" containerID="71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.848758 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\": container with ID starting with 71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764 not found: ID does not exist" containerID="71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.849036 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764"} err="failed to get container status \"71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\": rpc error: code = NotFound desc = could not find container \"71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764\": container with ID starting with 71912e3bacf35053ffa1c8590378aa9a0c88319533d888b0d191e4bce05ae764 not found: ID does not exist" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.849098 5036 scope.go:117] "RemoveContainer" containerID="f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.849490 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\": container with ID starting with f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e not found: ID does not exist" containerID="f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.849512 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e"} err="failed to get container status \"f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\": rpc error: code = NotFound desc = could not find container 
\"f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e\": container with ID starting with f71e2aac540c8ebaf6eca7a56c30aa6f65c2c637c7efdfab7999d74ffc2ecf4e not found: ID does not exist" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.849528 5036 scope.go:117] "RemoveContainer" containerID="5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.850079 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\": container with ID starting with 5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515 not found: ID does not exist" containerID="5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.850104 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515"} err="failed to get container status \"5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\": rpc error: code = NotFound desc = could not find container \"5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515\": container with ID starting with 5c3d9b76028a6b1f6b025ecd7227387c6ac179e613bb01e8d8d2947a88be0515 not found: ID does not exist" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.850118 5036 scope.go:117] "RemoveContainer" containerID="3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa" Jan 10 16:30:08 crc kubenswrapper[5036]: E0110 16:30:08.850587 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\": container with ID starting with 3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa not found: ID does not exist" containerID="3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa" Jan 10 16:30:08 crc kubenswrapper[5036]: I0110 16:30:08.850672 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa"} err="failed to get container status \"3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\": rpc error: code = NotFound desc = could not find container \"3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa\": container with ID starting with 3d6394d48ab10fcf2ad94b99e5cfe77817e75d1321ad59208535ec82228285aa not found: ID does not exist" Jan 10 16:30:09 crc kubenswrapper[5036]: I0110 16:30:09.052622 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:09 crc kubenswrapper[5036]: I0110 16:30:09.052971 5036 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: I0110 16:30:10.515576 5036 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.807927 5036 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.808248 5036 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.808520 5036 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.808757 5036 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.809219 5036 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:10 crc kubenswrapper[5036]: I0110 16:30:10.809253 5036 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 10 16:30:10 crc kubenswrapper[5036]: E0110 16:30:10.809597 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="200ms" Jan 10 16:30:11 crc kubenswrapper[5036]: E0110 16:30:11.010292 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="400ms" Jan 10 16:30:11 crc kubenswrapper[5036]: E0110 16:30:11.411292 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="800ms" Jan 10 16:30:11 crc kubenswrapper[5036]: E0110 16:30:11.965918 5036 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.83:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18896b929e2da256 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-10 16:30:06.18281839 +0000 UTC m=+128.053053884,LastTimestamp:2026-01-10 16:30:06.18281839 +0000 UTC m=+128.053053884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 10 16:30:12 crc kubenswrapper[5036]: E0110 16:30:12.212515 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="1.6s" Jan 10 16:30:13 crc kubenswrapper[5036]: E0110 16:30:13.528959 5036 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.83:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" volumeName="registry-storage" Jan 10 16:30:13 crc kubenswrapper[5036]: E0110 16:30:13.814245 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="3.2s" Jan 10 16:30:17 crc kubenswrapper[5036]: E0110 16:30:17.015959 5036 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.83:6443: connect: connection refused" interval="6.4s" Jan 10 16:30:18 crc kubenswrapper[5036]: I0110 16:30:18.518991 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.508045 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.509477 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.523058 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.523091 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:19 crc kubenswrapper[5036]: E0110 16:30:19.523599 5036 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.524299 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:19 crc kubenswrapper[5036]: W0110 16:30:19.558862 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-eef63348d36c35699131451a64c148517cb55bbd366263344a98e4123cfb4373 WatchSource:0}: Error finding container eef63348d36c35699131451a64c148517cb55bbd366263344a98e4123cfb4373: Status 404 returned error can't find the container with id eef63348d36c35699131451a64c148517cb55bbd366263344a98e4123cfb4373 Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.812045 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eef63348d36c35699131451a64c148517cb55bbd366263344a98e4123cfb4373"} Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.814192 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.814257 5036 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb" exitCode=1 Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.814305 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb"} Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.814931 5036 scope.go:117] "RemoveContainer" containerID="e08b53a3d87683275ba0e4ee4b22dd9929741e17a4e2246e68900bc15ab73dfb" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.817015 5036 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:19 crc kubenswrapper[5036]: I0110 16:30:19.817652 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.824645 5036 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="8f45e6d9c2fd327e237386c62cff9306e83b1a64164d1cf5ffe6dbc8a84c5971" exitCode=0 Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.824810 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"8f45e6d9c2fd327e237386c62cff9306e83b1a64164d1cf5ffe6dbc8a84c5971"} Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.825149 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.825187 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.825773 5036 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:20 crc kubenswrapper[5036]: E0110 16:30:20.825928 5036 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.826358 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.829867 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.829937 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2d69b0d6c7763606e7295b7e6241f519a772bf2ec43e381c670a134a96fbbaed"} Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.830819 5036 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:20 crc kubenswrapper[5036]: I0110 16:30:20.831183 5036 status_manager.go:851] "Failed to get status for pod" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.83:6443: connect: connection refused" Jan 10 16:30:21 crc kubenswrapper[5036]: I0110 16:30:21.575874 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 10 16:30:21 crc kubenswrapper[5036]: I0110 16:30:21.839864 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f88efc65d83cbaa0aae8f6c9c7982e48b832150811e040e65db8b0da5c4938b4"} Jan 10 16:30:21 crc kubenswrapper[5036]: I0110 16:30:21.839921 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7f247c609d0b8e303af4fc869a8c9278388e294246150e4ff3a3763e7dc4a3cf"} Jan 10 16:30:22 crc kubenswrapper[5036]: I0110 16:30:22.636664 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 10 16:30:22 crc kubenswrapper[5036]: I0110 16:30:22.654584 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 10 16:30:22 crc kubenswrapper[5036]: I0110 16:30:22.851106 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ef69fb43cc8543c53eab7758493805301a6d95b35710b884f1d0f9e7de2810f0"} Jan 10 16:30:23 crc kubenswrapper[5036]: I0110 16:30:23.861143 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d81401cdd90719e2d96d5af90f352e38292cb36829c2e9cae00975cf13cbe3c"} Jan 10 16:30:23 crc kubenswrapper[5036]: I0110 16:30:23.861529 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:23 crc kubenswrapper[5036]: I0110 16:30:23.861543 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e3f224d6fb8961611b484701576a402bb1ff1c66fa8af21986967c6482ac12aa"} Jan 10 16:30:23 crc kubenswrapper[5036]: I0110 16:30:23.861365 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:23 crc kubenswrapper[5036]: I0110 16:30:23.861574 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:24 crc kubenswrapper[5036]: I0110 16:30:24.525045 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 
16:30:24 crc kubenswrapper[5036]: I0110 16:30:24.525317 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:24 crc kubenswrapper[5036]: I0110 16:30:24.532394 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:25 crc kubenswrapper[5036]: I0110 16:30:25.904692 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:30:25 crc kubenswrapper[5036]: I0110 16:30:25.904744 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:30:26 crc kubenswrapper[5036]: I0110 16:30:26.962486 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" podUID="87b4bb91-70e1-44be-83a9-7b6adced3e51" containerName="oauth-openshift" containerID="cri-o://10a523a150199988fa9c1229061811decbedf09eb3488d0a97eeb5618b2f29f3" gracePeriod=15 Jan 10 16:30:27 crc kubenswrapper[5036]: I0110 16:30:27.886925 5036 generic.go:334] "Generic (PLEG): container finished" podID="87b4bb91-70e1-44be-83a9-7b6adced3e51" containerID="10a523a150199988fa9c1229061811decbedf09eb3488d0a97eeb5618b2f29f3" exitCode=0 Jan 10 16:30:27 crc kubenswrapper[5036]: I0110 16:30:27.886980 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" event={"ID":"87b4bb91-70e1-44be-83a9-7b6adced3e51","Type":"ContainerDied","Data":"10a523a150199988fa9c1229061811decbedf09eb3488d0a97eeb5618b2f29f3"} Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.736174 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795164 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795248 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795278 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795310 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795351 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795373 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795397 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795426 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795450 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: 
I0110 16:30:28.795512 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795536 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795551 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-878vn\" (UniqueName: \"kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795580 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.795628 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection\") pod \"87b4bb91-70e1-44be-83a9-7b6adced3e51\" (UID: \"87b4bb91-70e1-44be-83a9-7b6adced3e51\") " Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.796797 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.796816 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.797192 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.797356 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.797431 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.803187 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.803664 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.804221 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn" (OuterVolumeSpecName: "kube-api-access-878vn") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "kube-api-access-878vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.805241 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.805467 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.805887 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.806028 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.806203 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.809348 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "87b4bb91-70e1-44be-83a9-7b6adced3e51" (UID: "87b4bb91-70e1-44be-83a9-7b6adced3e51"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.878749 5036 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.894855 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" event={"ID":"87b4bb91-70e1-44be-83a9-7b6adced3e51","Type":"ContainerDied","Data":"260a57a9a7d43bed66a5e9c1cff25df5fc91c89a717e588e11e7814384539272"} Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.894918 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8vvs" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.894954 5036 scope.go:117] "RemoveContainer" containerID="10a523a150199988fa9c1229061811decbedf09eb3488d0a97eeb5618b2f29f3" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896855 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896884 5036 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896900 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-878vn\" (UniqueName: \"kubernetes.io/projected/87b4bb91-70e1-44be-83a9-7b6adced3e51-kube-api-access-878vn\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896917 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896928 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896941 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896952 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896965 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896975 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896985 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.896996 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-service-ca\") on node \"crc\" DevicePath 
\"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.897013 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.897024 5036 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/87b4bb91-70e1-44be-83a9-7b6adced3e51-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:28 crc kubenswrapper[5036]: I0110 16:30:28.897034 5036 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/87b4bb91-70e1-44be-83a9-7b6adced3e51-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:29 crc kubenswrapper[5036]: I0110 16:30:29.534907 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:29 crc kubenswrapper[5036]: I0110 16:30:29.539054 5036 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="35a56e2c-6fe9-4dac-8862-3761ff5c47e1" Jan 10 16:30:29 crc kubenswrapper[5036]: I0110 16:30:29.903672 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:29 crc kubenswrapper[5036]: I0110 16:30:29.903736 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:30 crc kubenswrapper[5036]: I0110 16:30:30.911289 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:30 crc kubenswrapper[5036]: I0110 16:30:30.911344 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:31 crc kubenswrapper[5036]: I0110 16:30:31.580421 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 10 16:30:38 crc kubenswrapper[5036]: I0110 16:30:38.031539 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 10 16:30:38 crc kubenswrapper[5036]: I0110 16:30:38.526645 5036 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="35a56e2c-6fe9-4dac-8862-3761ff5c47e1" Jan 10 16:30:38 crc kubenswrapper[5036]: I0110 16:30:38.851353 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.062323 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.277888 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.329913 5036 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.390372 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.428859 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.477541 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 10 16:30:39 crc kubenswrapper[5036]: I0110 16:30:39.545455 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.004619 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.021225 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.205265 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.348981 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.520883 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.529495 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.548114 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.577523 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.712865 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.725461 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.833406 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.859514 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.866324 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.882231 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 10 16:30:40 crc kubenswrapper[5036]: I0110 16:30:40.992180 5036 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.190639 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.194920 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.205788 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.230136 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.295029 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.340570 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.371992 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.493668 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.496517 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.513970 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.532083 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.543075 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.631774 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.700435 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.864721 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 10 16:30:41 crc kubenswrapper[5036]: I0110 16:30:41.871557 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.031831 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.038983 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.108740 5036 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.281203 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.288938 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.354398 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.436613 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.568637 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.599654 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.658998 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.687069 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.776492 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.798539 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.803735 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 10 16:30:42 crc kubenswrapper[5036]: I0110 16:30:42.904584 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.102510 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.103047 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.125951 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.181854 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.219782 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.282105 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.301599 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 10 
16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.353239 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.581192 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.659439 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.867290 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.903295 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.934796 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.937607 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 10 16:30:43 crc kubenswrapper[5036]: I0110 16:30:43.943578 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.079566 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.118149 5036 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.137617 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.180280 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.211063 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.233786 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.240091 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.284189 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.344517 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.456623 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.570663 5036 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.724378 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.766861 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.865764 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.882853 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 10 16:30:44 crc kubenswrapper[5036]: I0110 16:30:44.983646 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.043276 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.177029 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.179923 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.246391 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.262061 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.262602 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.288049 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.466416 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.497440 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.578330 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.628321 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.650324 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.663957 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.676980 5036 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.679519 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.735176 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.788035 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.922785 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.948475 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 10 16:30:45 crc kubenswrapper[5036]: I0110 16:30:45.978902 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.037243 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.092121 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.118278 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.178357 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.193820 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.195447 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.256947 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.266097 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.281813 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.357207 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.366603 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.503281 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.536523 5036 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.566058 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.639919 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.687901 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.725630 5036 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.827876 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.867457 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 10 16:30:46 crc kubenswrapper[5036]: I0110 16:30:46.928368 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.010378 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.018970 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.036522 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.151120 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.274515 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.296026 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.313372 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.320365 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.331556 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.375504 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.460363 5036 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.467715 5036 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-c8vvs"] Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.467802 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7489ccbc46-wzq6m","openshift-kube-apiserver/kube-apiserver-crc"] Jan 10 16:30:47 crc kubenswrapper[5036]: E0110 16:30:47.468083 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b4bb91-70e1-44be-83a9-7b6adced3e51" containerName="oauth-openshift" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468111 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b4bb91-70e1-44be-83a9-7b6adced3e51" containerName="oauth-openshift" Jan 10 16:30:47 crc kubenswrapper[5036]: E0110 16:30:47.468141 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" containerName="installer" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468155 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" containerName="installer" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468330 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5896d91a-6760-4d04-86ac-c45a6da0fa45" containerName="installer" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468378 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b4bb91-70e1-44be-83a9-7b6adced3e51" containerName="oauth-openshift" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468418 5036 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.468452 5036 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e098c043-2e79-4678-bc14-4306571d12df" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.469021 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.472238 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.473001 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.473305 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.473958 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.474091 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.475383 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.476361 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.477020 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.477087 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.477362 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.477382 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.477576 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.478145 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.488389 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.489529 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.500652 5036 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.515387 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.516180 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.516157518 podStartE2EDuration="19.516157518s" 
podCreationTimestamp="2026-01-10 16:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:30:47.497995312 +0000 UTC m=+169.368230866" watchObservedRunningTime="2026-01-10 16:30:47.516157518 +0000 UTC m=+169.386393022" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.567857 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.568821 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.584615 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.584717 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585292 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585480 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzj7k\" (UniqueName: \"kubernetes.io/projected/ba300922-7634-4db7-b5d2-6787bfc325e5-kube-api-access-zzj7k\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585524 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585585 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585628 
5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-policies\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585655 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585689 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585713 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-login\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585761 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585817 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-session\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585899 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-dir\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.585943 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-error\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " 
pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.613409 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.616048 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.618535 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.659391 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.687906 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzj7k\" (UniqueName: \"kubernetes.io/projected/ba300922-7634-4db7-b5d2-6787bfc325e5-kube-api-access-zzj7k\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.687984 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688053 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-policies\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688093 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688131 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688172 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688220 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-login\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688261 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688302 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-session\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688342 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-dir\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688390 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-error\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688445 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688479 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688513 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.688748 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-dir\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.689939 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-audit-policies\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.690137 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-service-ca\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.690248 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.691651 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.695101 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-error\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.696000 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.696011 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.698597 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.698543 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-router-certs\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.700968 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-user-template-login\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.702384 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-session\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.702445 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ba300922-7634-4db7-b5d2-6787bfc325e5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.710710 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzj7k\" (UniqueName: \"kubernetes.io/projected/ba300922-7634-4db7-b5d2-6787bfc325e5-kube-api-access-zzj7k\") pod \"oauth-openshift-7489ccbc46-wzq6m\" (UID: \"ba300922-7634-4db7-b5d2-6787bfc325e5\") " pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.726241 5036 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.748198 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.804949 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.854252 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 10 16:30:47 crc kubenswrapper[5036]: I0110 16:30:47.917021 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.033450 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.043964 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.045195 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.145935 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.148319 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.150380 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.163731 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.180081 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.197839 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.267291 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7489ccbc46-wzq6m"] Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.268253 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.351539 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.353361 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.517695 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87b4bb91-70e1-44be-83a9-7b6adced3e51" path="/var/lib/kubelet/pods/87b4bb91-70e1-44be-83a9-7b6adced3e51/volumes" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.575830 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.616049 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.649764 
5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.788145 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.865541 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.872006 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.892631 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.898990 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 10 16:30:48 crc kubenswrapper[5036]: I0110 16:30:48.923420 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.023744 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7489ccbc46-wzq6m_ba300922-7634-4db7-b5d2-6787bfc325e5/oauth-openshift/0.log" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.023785 5036 generic.go:334] "Generic (PLEG): container finished" podID="ba300922-7634-4db7-b5d2-6787bfc325e5" containerID="4565ccc69a4d0271d67ea61445f44a62918fa141dc82acba0286e0c8bdd0c2ab" exitCode=255 Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.024758 5036 scope.go:117] "RemoveContainer" containerID="4565ccc69a4d0271d67ea61445f44a62918fa141dc82acba0286e0c8bdd0c2ab" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.024827 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" event={"ID":"ba300922-7634-4db7-b5d2-6787bfc325e5","Type":"ContainerDied","Data":"4565ccc69a4d0271d67ea61445f44a62918fa141dc82acba0286e0c8bdd0c2ab"} Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.024879 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" event={"ID":"ba300922-7634-4db7-b5d2-6787bfc325e5","Type":"ContainerStarted","Data":"e9ead8b6f60416724b28ec87b720b873d7692b9d2b737ad03f623149afa2f4d9"} Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.138417 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.244647 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.248031 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.285216 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.315422 5036 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.347510 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.463060 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.465354 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.562120 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.634761 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.720203 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.726804 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.758040 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.923645 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.930017 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.941038 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.983457 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 10 16:30:49 crc kubenswrapper[5036]: I0110 16:30:49.984261 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.004643 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.034571 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7489ccbc46-wzq6m_ba300922-7634-4db7-b5d2-6787bfc325e5/oauth-openshift/1.log" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.035510 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7489ccbc46-wzq6m_ba300922-7634-4db7-b5d2-6787bfc325e5/oauth-openshift/0.log" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.035664 5036 generic.go:334] "Generic (PLEG): container finished" podID="ba300922-7634-4db7-b5d2-6787bfc325e5" containerID="222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d" exitCode=255 Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.035802 5036 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" event={"ID":"ba300922-7634-4db7-b5d2-6787bfc325e5","Type":"ContainerDied","Data":"222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d"} Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.036040 5036 scope.go:117] "RemoveContainer" containerID="4565ccc69a4d0271d67ea61445f44a62918fa141dc82acba0286e0c8bdd0c2ab" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.036655 5036 scope.go:117] "RemoveContainer" containerID="222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d" Jan 10 16:30:50 crc kubenswrapper[5036]: E0110 16:30:50.037077 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-7489ccbc46-wzq6m_openshift-authentication(ba300922-7634-4db7-b5d2-6787bfc325e5)\"" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" podUID="ba300922-7634-4db7-b5d2-6787bfc325e5" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.155834 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.310297 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.311815 5036 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.320940 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.324608 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.339450 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.391759 5036 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.392306 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d" gracePeriod=5 Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.415884 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.442052 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.515978 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.530302 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 10 16:30:50 crc 
kubenswrapper[5036]: I0110 16:30:50.573536 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.583897 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.589523 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.617260 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.673825 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.688215 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.754737 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.816354 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.952367 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 10 16:30:50 crc kubenswrapper[5036]: I0110 16:30:50.954465 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.043516 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7489ccbc46-wzq6m_ba300922-7634-4db7-b5d2-6787bfc325e5/oauth-openshift/1.log" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.044592 5036 scope.go:117] "RemoveContainer" containerID="222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d" Jan 10 16:30:51 crc kubenswrapper[5036]: E0110 16:30:51.045198 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-7489ccbc46-wzq6m_openshift-authentication(ba300922-7634-4db7-b5d2-6787bfc325e5)\"" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" podUID="ba300922-7634-4db7-b5d2-6787bfc325e5" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.163536 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.281974 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.378752 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.393410 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 
16:30:51.474172 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.503516 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.559096 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.592162 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.614580 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.633972 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.648122 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.695025 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.719053 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.861599 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.933067 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 10 16:30:51 crc kubenswrapper[5036]: I0110 16:30:51.956308 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 10 16:30:52 crc kubenswrapper[5036]: I0110 16:30:52.170947 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 10 16:30:52 crc kubenswrapper[5036]: I0110 16:30:52.354778 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 10 16:30:52 crc kubenswrapper[5036]: I0110 16:30:52.698669 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 10 16:30:52 crc kubenswrapper[5036]: I0110 16:30:52.747788 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 10 16:30:52 crc kubenswrapper[5036]: I0110 16:30:52.906449 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 10 16:30:53 crc kubenswrapper[5036]: I0110 16:30:53.217539 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 10 16:30:53 crc kubenswrapper[5036]: I0110 16:30:53.346783 5036 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:30:53 crc kubenswrapper[5036]: I0110 16:30:53.540874 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:30:53 crc kubenswrapper[5036]: I0110 16:30:53.613981 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:30:53 crc kubenswrapper[5036]: I0110 16:30:53.920788 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 10 16:30:54 crc kubenswrapper[5036]: I0110 16:30:54.140630 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.574533 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.575441 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" podUID="e7165c35-9b98-4950-bd64-a6cdb0454463" containerName="controller-manager" containerID="cri-o://d4c1a31617a2e80a8e6151e973246c8b82778a5371b775a7afcd70097d2b5b00" gracePeriod=30 Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.586044 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.586329 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" podUID="78b8ad26-c462-4fed-bb9d-98bf88363c35" containerName="route-controller-manager" containerID="cri-o://3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b" gracePeriod=30 Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.904506 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.904593 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.989316 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 10 16:30:55 crc kubenswrapper[5036]: I0110 16:30:55.989839 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.032247 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.079134 5036 generic.go:334] "Generic (PLEG): container finished" podID="e7165c35-9b98-4950-bd64-a6cdb0454463" containerID="d4c1a31617a2e80a8e6151e973246c8b82778a5371b775a7afcd70097d2b5b00" exitCode=0 Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.079216 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" event={"ID":"e7165c35-9b98-4950-bd64-a6cdb0454463","Type":"ContainerDied","Data":"d4c1a31617a2e80a8e6151e973246c8b82778a5371b775a7afcd70097d2b5b00"} Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.079284 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" event={"ID":"e7165c35-9b98-4950-bd64-a6cdb0454463","Type":"ContainerDied","Data":"61efbbbe47135e0b27729e943745d147db98a383f0a4f94b11c1233783c87265"} Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.079297 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61efbbbe47135e0b27729e943745d147db98a383f0a4f94b11c1233783c87265" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.080362 5036 generic.go:334] "Generic (PLEG): container finished" podID="78b8ad26-c462-4fed-bb9d-98bf88363c35" containerID="3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b" exitCode=0 Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.080411 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" event={"ID":"78b8ad26-c462-4fed-bb9d-98bf88363c35","Type":"ContainerDied","Data":"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b"} Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.080428 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" event={"ID":"78b8ad26-c462-4fed-bb9d-98bf88363c35","Type":"ContainerDied","Data":"cff03df4e6249e960742ea3ea50a86047daec36f922a2fdf6dab41d1628791b6"} Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.080446 5036 scope.go:117] "RemoveContainer" containerID="3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.080554 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.082988 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.083022 5036 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d" exitCode=137 Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.083084 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.104781 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.104963 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.104912 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.105102 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.105066 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.105495 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.105160 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106254 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106306 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config\") pod \"78b8ad26-c462-4fed-bb9d-98bf88363c35\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106489 5036 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106515 5036 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106527 5036 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.106802 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.107471 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config" (OuterVolumeSpecName: "config") pod "78b8ad26-c462-4fed-bb9d-98bf88363c35" (UID: "78b8ad26-c462-4fed-bb9d-98bf88363c35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.115433 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.121434 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.123266 5036 scope.go:117] "RemoveContainer" containerID="3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b" Jan 10 16:30:56 crc kubenswrapper[5036]: E0110 16:30:56.123889 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b\": container with ID starting with 3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b not found: ID does not exist" containerID="3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.123927 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b"} err="failed to get container status \"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b\": rpc error: code = NotFound desc = could not find container \"3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b\": container with ID starting with 3249e76be5524c02b154cc9cb5b0dde24ea2ed4a2e4e3ae7eeebc6c0679cea0b not found: ID does not exist" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.123964 5036 scope.go:117] "RemoveContainer" containerID="a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.149482 5036 scope.go:117] "RemoveContainer" containerID="a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d" Jan 10 16:30:56 crc kubenswrapper[5036]: E0110 16:30:56.150240 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d\": container with ID starting with a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d not found: ID does not exist" containerID="a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.150289 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d"} err="failed to get container status \"a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d\": rpc error: code = NotFound desc = could not find container \"a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d\": container with ID starting with a94e20c953d6c92c845d468d3c6b38753f1ca46a8fe49a2522e718fb4f958e8d not found: ID does not exist" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207344 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca\") pod \"e7165c35-9b98-4950-bd64-a6cdb0454463\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207402 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca\") pod \"78b8ad26-c462-4fed-bb9d-98bf88363c35\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207436 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config\") pod \"e7165c35-9b98-4950-bd64-a6cdb0454463\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207459 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert\") pod \"78b8ad26-c462-4fed-bb9d-98bf88363c35\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207492 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert\") pod \"e7165c35-9b98-4950-bd64-a6cdb0454463\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207526 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lrg5\" (UniqueName: \"kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5\") pod \"e7165c35-9b98-4950-bd64-a6cdb0454463\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207584 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles\") pod \"e7165c35-9b98-4950-bd64-a6cdb0454463\" (UID: \"e7165c35-9b98-4950-bd64-a6cdb0454463\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207607 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd9b9\" (UniqueName: \"kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9\") pod \"78b8ad26-c462-4fed-bb9d-98bf88363c35\" (UID: \"78b8ad26-c462-4fed-bb9d-98bf88363c35\") " Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207800 5036 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207816 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.207827 5036 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.208162 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca" (OuterVolumeSpecName: "client-ca") pod "78b8ad26-c462-4fed-bb9d-98bf88363c35" (UID: "78b8ad26-c462-4fed-bb9d-98bf88363c35"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.208470 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca" (OuterVolumeSpecName: "client-ca") pod "e7165c35-9b98-4950-bd64-a6cdb0454463" (UID: "e7165c35-9b98-4950-bd64-a6cdb0454463"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.209069 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config" (OuterVolumeSpecName: "config") pod "e7165c35-9b98-4950-bd64-a6cdb0454463" (UID: "e7165c35-9b98-4950-bd64-a6cdb0454463"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.209147 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e7165c35-9b98-4950-bd64-a6cdb0454463" (UID: "e7165c35-9b98-4950-bd64-a6cdb0454463"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.211202 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78b8ad26-c462-4fed-bb9d-98bf88363c35" (UID: "78b8ad26-c462-4fed-bb9d-98bf88363c35"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.211735 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9" (OuterVolumeSpecName: "kube-api-access-nd9b9") pod "78b8ad26-c462-4fed-bb9d-98bf88363c35" (UID: "78b8ad26-c462-4fed-bb9d-98bf88363c35"). InnerVolumeSpecName "kube-api-access-nd9b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.211886 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5" (OuterVolumeSpecName: "kube-api-access-7lrg5") pod "e7165c35-9b98-4950-bd64-a6cdb0454463" (UID: "e7165c35-9b98-4950-bd64-a6cdb0454463"). InnerVolumeSpecName "kube-api-access-7lrg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.212516 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7165c35-9b98-4950-bd64-a6cdb0454463" (UID: "e7165c35-9b98-4950-bd64-a6cdb0454463"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309539 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7165c35-9b98-4950-bd64-a6cdb0454463-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309600 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lrg5\" (UniqueName: \"kubernetes.io/projected/e7165c35-9b98-4950-bd64-a6cdb0454463-kube-api-access-7lrg5\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309612 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd9b9\" (UniqueName: \"kubernetes.io/projected/78b8ad26-c462-4fed-bb9d-98bf88363c35-kube-api-access-nd9b9\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309621 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309631 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309640 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78b8ad26-c462-4fed-bb9d-98bf88363c35-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309650 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7165c35-9b98-4950-bd64-a6cdb0454463-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.309659 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78b8ad26-c462-4fed-bb9d-98bf88363c35-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.423622 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.431380 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d4b946969-gjjz9"] Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.518022 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b8ad26-c462-4fed-bb9d-98bf88363c35" path="/var/lib/kubelet/pods/78b8ad26-c462-4fed-bb9d-98bf88363c35/volumes" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.518447 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855006 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:30:56 crc kubenswrapper[5036]: E0110 16:30:56.855560 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855591 5036 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 10 16:30:56 crc kubenswrapper[5036]: E0110 16:30:56.855617 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7165c35-9b98-4950-bd64-a6cdb0454463" containerName="controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855632 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7165c35-9b98-4950-bd64-a6cdb0454463" containerName="controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: E0110 16:30:56.855647 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b8ad26-c462-4fed-bb9d-98bf88363c35" containerName="route-controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855660 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b8ad26-c462-4fed-bb9d-98bf88363c35" containerName="route-controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855885 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855916 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b8ad26-c462-4fed-bb9d-98bf88363c35" containerName="route-controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.855947 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7165c35-9b98-4950-bd64-a6cdb0454463" containerName="controller-manager" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.856803 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.860807 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.861839 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.864182 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.864509 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.864729 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.864988 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.865225 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.865489 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.865620 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.870527 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916339 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916388 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916416 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916592 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gg96p\" (UniqueName: \"kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916786 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916841 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6dx\" (UniqueName: \"kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:56 crc kubenswrapper[5036]: I0110 16:30:56.916920 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018052 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018103 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018126 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018155 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018175 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg96p\" (UniqueName: \"kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018193 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018211 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018237 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6dx\" (UniqueName: \"kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.018260 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.019544 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.019840 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.020799 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " 
pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.021489 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.021787 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.027502 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.032141 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.037776 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg96p\" (UniqueName: \"kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p\") pod \"controller-manager-774c5dd755-2q2bk\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.039804 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6dx\" (UniqueName: \"kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx\") pod \"route-controller-manager-85d4fcb4f-lztwb\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.090963 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c5dff577d-96tm8" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.111162 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.127654 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c5dff577d-96tm8"] Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.191771 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.199839 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.414332 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.491085 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.805807 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.805881 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:30:57 crc kubenswrapper[5036]: I0110 16:30:57.806734 5036 scope.go:117] "RemoveContainer" containerID="222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d" Jan 10 16:30:57 crc kubenswrapper[5036]: E0110 16:30:57.807109 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-7489ccbc46-wzq6m_openshift-authentication(ba300922-7634-4db7-b5d2-6787bfc325e5)\"" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" podUID="ba300922-7634-4db7-b5d2-6787bfc325e5" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.098809 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" event={"ID":"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19","Type":"ContainerStarted","Data":"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa"} Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.098861 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" event={"ID":"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19","Type":"ContainerStarted","Data":"4327f8eccba687523786cf50b4e90d20899747383005019f2073725a77c50631"} Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.099046 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.099655 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" event={"ID":"706cc053-7669-4fef-b78a-fd5b5223a7f7","Type":"ContainerStarted","Data":"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7"} Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.099674 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" event={"ID":"706cc053-7669-4fef-b78a-fd5b5223a7f7","Type":"ContainerStarted","Data":"ce03e5ed5c9fa04fbc55d9200603d9d380ddc016cd13f781ca7084cb18c19348"} Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.100408 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.104830 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.104936 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.118883 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" podStartSLOduration=3.118862202 podStartE2EDuration="3.118862202s" podCreationTimestamp="2026-01-10 16:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:30:58.116240713 +0000 UTC m=+179.986476207" watchObservedRunningTime="2026-01-10 16:30:58.118862202 +0000 UTC m=+179.989097706" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.154857 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" podStartSLOduration=3.154823052 podStartE2EDuration="3.154823052s" podCreationTimestamp="2026-01-10 16:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:30:58.136543732 +0000 UTC m=+180.006779226" watchObservedRunningTime="2026-01-10 16:30:58.154823052 +0000 UTC m=+180.025058546" Jan 10 16:30:58 crc kubenswrapper[5036]: I0110 16:30:58.519309 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7165c35-9b98-4950-bd64-a6cdb0454463" path="/var/lib/kubelet/pods/e7165c35-9b98-4950-bd64-a6cdb0454463/volumes" Jan 10 16:31:11 crc kubenswrapper[5036]: I0110 16:31:11.508637 5036 scope.go:117] "RemoveContainer" containerID="222bc7e908ed395c5757e8e41e5bc50e510c6b504685dcbe3b3d6844031d826d" Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.189763 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7489ccbc46-wzq6m_ba300922-7634-4db7-b5d2-6787bfc325e5/oauth-openshift/1.log" Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.189903 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" event={"ID":"ba300922-7634-4db7-b5d2-6787bfc325e5","Type":"ContainerStarted","Data":"2bec617136752f5dd6632187bf1c7e419e5e9c5c8dbd88403632c48bb422097b"} Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.190332 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.193264 5036 generic.go:334] "Generic (PLEG): container finished" podID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerID="6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d" exitCode=0 Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.193325 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerDied","Data":"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d"} Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.193998 5036 scope.go:117] "RemoveContainer" containerID="6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d" Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.219499 5036 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" podStartSLOduration=71.21947273 podStartE2EDuration="1m11.21947273s" podCreationTimestamp="2026-01-10 16:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:31:12.212777203 +0000 UTC m=+194.083012707" watchObservedRunningTime="2026-01-10 16:31:12.21947273 +0000 UTC m=+194.089708244" Jan 10 16:31:12 crc kubenswrapper[5036]: I0110 16:31:12.293181 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7489ccbc46-wzq6m" Jan 10 16:31:13 crc kubenswrapper[5036]: I0110 16:31:13.208174 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerStarted","Data":"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8"} Jan 10 16:31:13 crc kubenswrapper[5036]: I0110 16:31:13.209056 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:31:13 crc kubenswrapper[5036]: I0110 16:31:13.213062 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:31:15 crc kubenswrapper[5036]: I0110 16:31:15.567807 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:31:15 crc kubenswrapper[5036]: I0110 16:31:15.568118 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" podUID="706cc053-7669-4fef-b78a-fd5b5223a7f7" containerName="controller-manager" containerID="cri-o://16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7" gracePeriod=30 Jan 10 16:31:15 crc kubenswrapper[5036]: I0110 16:31:15.578074 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:31:15 crc kubenswrapper[5036]: I0110 16:31:15.578673 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" podUID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" containerName="route-controller-manager" containerID="cri-o://76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa" gracePeriod=30 Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.132954 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.204159 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.228820 5036 generic.go:334] "Generic (PLEG): container finished" podID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" containerID="76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa" exitCode=0 Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.228906 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.228930 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" event={"ID":"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19","Type":"ContainerDied","Data":"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa"} Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.229214 5036 scope.go:117] "RemoveContainer" containerID="76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.229158 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb" event={"ID":"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19","Type":"ContainerDied","Data":"4327f8eccba687523786cf50b4e90d20899747383005019f2073725a77c50631"} Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.233561 5036 generic.go:334] "Generic (PLEG): container finished" podID="706cc053-7669-4fef-b78a-fd5b5223a7f7" containerID="16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7" exitCode=0 Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.233610 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" event={"ID":"706cc053-7669-4fef-b78a-fd5b5223a7f7","Type":"ContainerDied","Data":"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7"} Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.233650 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" event={"ID":"706cc053-7669-4fef-b78a-fd5b5223a7f7","Type":"ContainerDied","Data":"ce03e5ed5c9fa04fbc55d9200603d9d380ddc016cd13f781ca7084cb18c19348"} Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.233730 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-2q2bk" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.237184 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert\") pod \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.237294 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config\") pod \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.237417 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca\") pod \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.237468 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r6dx\" (UniqueName: \"kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx\") pod \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\" (UID: \"7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.238482 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" (UID: "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.238535 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config" (OuterVolumeSpecName: "config") pod "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" (UID: "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.245989 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" (UID: "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.247817 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx" (OuterVolumeSpecName: "kube-api-access-4r6dx") pod "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" (UID: "7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19"). InnerVolumeSpecName "kube-api-access-4r6dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.257500 5036 scope.go:117] "RemoveContainer" containerID="76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa" Jan 10 16:31:16 crc kubenswrapper[5036]: E0110 16:31:16.258145 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa\": container with ID starting with 76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa not found: ID does not exist" containerID="76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.258275 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa"} err="failed to get container status \"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa\": rpc error: code = NotFound desc = could not find container \"76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa\": container with ID starting with 76dfb8ac30f5be17b2dcf1b0daef3e6bb9b08f7e6dbd17cb4559f43204fdd1aa not found: ID does not exist" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.258374 5036 scope.go:117] "RemoveContainer" containerID="16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.279155 5036 scope.go:117] "RemoveContainer" containerID="16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7" Jan 10 16:31:16 crc kubenswrapper[5036]: E0110 16:31:16.279911 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7\": container with ID starting with 16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7 not found: ID does not exist" containerID="16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.280023 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7"} err="failed to get container status \"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7\": rpc error: code = NotFound desc = could not find container \"16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7\": container with ID starting with 16ce06906c3e2bfbf3fb1d6e9bd0d8044ad580ea218e782d6a38df478092dbd7 not found: ID does not exist" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.338860 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert\") pod \"706cc053-7669-4fef-b78a-fd5b5223a7f7\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.339740 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg96p\" (UniqueName: \"kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p\") pod \"706cc053-7669-4fef-b78a-fd5b5223a7f7\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.339958 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles\") pod \"706cc053-7669-4fef-b78a-fd5b5223a7f7\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340096 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config\") pod \"706cc053-7669-4fef-b78a-fd5b5223a7f7\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340194 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca\") pod \"706cc053-7669-4fef-b78a-fd5b5223a7f7\" (UID: \"706cc053-7669-4fef-b78a-fd5b5223a7f7\") " Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340569 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340655 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r6dx\" (UniqueName: \"kubernetes.io/projected/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-kube-api-access-4r6dx\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340779 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340892 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340718 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "706cc053-7669-4fef-b78a-fd5b5223a7f7" (UID: "706cc053-7669-4fef-b78a-fd5b5223a7f7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340905 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config" (OuterVolumeSpecName: "config") pod "706cc053-7669-4fef-b78a-fd5b5223a7f7" (UID: "706cc053-7669-4fef-b78a-fd5b5223a7f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.340926 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "706cc053-7669-4fef-b78a-fd5b5223a7f7" (UID: "706cc053-7669-4fef-b78a-fd5b5223a7f7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.343863 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p" (OuterVolumeSpecName: "kube-api-access-gg96p") pod "706cc053-7669-4fef-b78a-fd5b5223a7f7" (UID: "706cc053-7669-4fef-b78a-fd5b5223a7f7"). InnerVolumeSpecName "kube-api-access-gg96p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.343894 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "706cc053-7669-4fef-b78a-fd5b5223a7f7" (UID: "706cc053-7669-4fef-b78a-fd5b5223a7f7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.442914 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.442973 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.442994 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706cc053-7669-4fef-b78a-fd5b5223a7f7-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.443012 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706cc053-7669-4fef-b78a-fd5b5223a7f7-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.443032 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg96p\" (UniqueName: \"kubernetes.io/projected/706cc053-7669-4fef-b78a-fd5b5223a7f7-kube-api-access-gg96p\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.571944 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.577497 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-lztwb"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.589695 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.594199 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-2q2bk"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.869844 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:16 crc kubenswrapper[5036]: E0110 16:31:16.870150 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="706cc053-7669-4fef-b78a-fd5b5223a7f7" containerName="controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.870164 5036 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="706cc053-7669-4fef-b78a-fd5b5223a7f7" containerName="controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: E0110 16:31:16.870174 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" containerName="route-controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.870181 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" containerName="route-controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.870318 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" containerName="route-controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.870330 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="706cc053-7669-4fef-b78a-fd5b5223a7f7" containerName="controller-manager" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.870950 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.875621 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.876086 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.876337 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.876543 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.876643 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.876849 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.886823 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.888420 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.891205 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.894486 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.895085 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.895335 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.895542 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.897644 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.897983 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.899022 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 10 16:31:16 crc kubenswrapper[5036]: I0110 16:31:16.914188 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.055950 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzwm\" (UniqueName: \"kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056027 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkgrh\" (UniqueName: \"kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056055 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056218 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056320 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056392 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056438 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056568 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.056653 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158242 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swzwm\" (UniqueName: \"kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158344 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkgrh\" (UniqueName: \"kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158381 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " 
pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158417 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158445 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158471 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158491 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158518 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.158549 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.160287 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.160388 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.161027 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.161125 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.161918 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.164535 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.170633 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.183247 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzwm\" (UniqueName: \"kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm\") pod \"controller-manager-78b59b8d64-vnmf7\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.184995 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkgrh\" (UniqueName: \"kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh\") pod \"route-controller-manager-5c6bcbf77d-2g2nz\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.216599 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.225347 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.477926 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:31:17 crc kubenswrapper[5036]: I0110 16:31:17.764613 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:17 crc kubenswrapper[5036]: W0110 16:31:17.772766 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f5ae2e_140f_4209_91fd_ea887aa999c3.slice/crio-5886857464e25327312d6be807485cd014b45833664653794aa4416048c2b1b8 WatchSource:0}: Error finding container 5886857464e25327312d6be807485cd014b45833664653794aa4416048c2b1b8: Status 404 returned error can't find the container with id 5886857464e25327312d6be807485cd014b45833664653794aa4416048c2b1b8 Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.256407 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" event={"ID":"587a465d-a421-4b2c-834f-9640d82b1a6f","Type":"ContainerStarted","Data":"57da2a26c88ab17c84ee530c269548bc00745fde4fad8e450622e6bf6a76f96b"} Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.256929 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" event={"ID":"587a465d-a421-4b2c-834f-9640d82b1a6f","Type":"ContainerStarted","Data":"537630ad32e06212bbbb2ebdda8070dc8cb01b2ed0dfcf32757e43794ca36b38"} Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.258648 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" event={"ID":"c2f5ae2e-140f-4209-91fd-ea887aa999c3","Type":"ContainerStarted","Data":"14e6c8c424350661902e0f47696af20492dffc36efb24cf76c37e314de469bed"} Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.258719 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" event={"ID":"c2f5ae2e-140f-4209-91fd-ea887aa999c3","Type":"ContainerStarted","Data":"5886857464e25327312d6be807485cd014b45833664653794aa4416048c2b1b8"} Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.259002 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.279957 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" podStartSLOduration=3.279936484 podStartE2EDuration="3.279936484s" podCreationTimestamp="2026-01-10 16:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:31:18.27546417 +0000 UTC m=+200.145699704" watchObservedRunningTime="2026-01-10 16:31:18.279936484 +0000 UTC m=+200.150171978" Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.298617 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" podStartSLOduration=3.298586361 podStartE2EDuration="3.298586361s" podCreationTimestamp="2026-01-10 16:31:15 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:31:18.29150656 +0000 UTC m=+200.161742054" watchObservedRunningTime="2026-01-10 16:31:18.298586361 +0000 UTC m=+200.168821885" Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.457900 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.516350 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706cc053-7669-4fef-b78a-fd5b5223a7f7" path="/var/lib/kubelet/pods/706cc053-7669-4fef-b78a-fd5b5223a7f7/volumes" Jan 10 16:31:18 crc kubenswrapper[5036]: I0110 16:31:18.517547 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19" path="/var/lib/kubelet/pods/7d1b97a6-5cfa-4fa3-9ca1-14cd6bc21b19/volumes" Jan 10 16:31:19 crc kubenswrapper[5036]: I0110 16:31:19.264466 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:19 crc kubenswrapper[5036]: I0110 16:31:19.270301 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:31:19 crc kubenswrapper[5036]: I0110 16:31:19.572051 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.210716 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.212013 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hm8ns" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="registry-server" containerID="cri-o://66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0" gracePeriod=30 Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.218149 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.218485 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w55c" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="registry-server" containerID="cri-o://6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0" gracePeriod=30 Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.244820 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.245579 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" containerID="cri-o://9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8" gracePeriod=30 Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.253535 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.254201 5036 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-marketplace-vzvbk" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="registry-server" containerID="cri-o://a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" gracePeriod=30 Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.267700 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.268046 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q72r7" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="registry-server" containerID="cri-o://78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227" gracePeriod=30 Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.275969 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gm65z"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.276993 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.296655 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gm65z"] Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.453630 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.454086 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.454131 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29w6h\" (UniqueName: \"kubernetes.io/projected/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-kube-api-access-29w6h\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.555439 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.555534 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29w6h\" (UniqueName: \"kubernetes.io/projected/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-kube-api-access-29w6h\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.555605 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.557529 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.569333 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.577607 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29w6h\" (UniqueName: \"kubernetes.io/projected/d8de44e3-ed07-4c76-8aa8-2265c9cd1805-kube-api-access-29w6h\") pod \"marketplace-operator-79b997595-gm65z\" (UID: \"d8de44e3-ed07-4c76-8aa8-2265c9cd1805\") " pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: E0110 16:31:23.649400 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 is running failed: container process not found" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" cmd=["grpc_health_probe","-addr=:50051"] Jan 10 16:31:23 crc kubenswrapper[5036]: E0110 16:31:23.650736 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 is running failed: container process not found" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" cmd=["grpc_health_probe","-addr=:50051"] Jan 10 16:31:23 crc kubenswrapper[5036]: E0110 16:31:23.651417 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 is running failed: container process not found" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" cmd=["grpc_health_probe","-addr=:50051"] Jan 10 16:31:23 crc kubenswrapper[5036]: E0110 16:31:23.651524 5036 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-vzvbk" 
podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="registry-server" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.655103 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.659610 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.861888 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content\") pod \"0239b380-03c8-455e-a981-2aaaae000828\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.862274 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities\") pod \"0239b380-03c8-455e-a981-2aaaae000828\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.862350 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhh59\" (UniqueName: \"kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59\") pod \"0239b380-03c8-455e-a981-2aaaae000828\" (UID: \"0239b380-03c8-455e-a981-2aaaae000828\") " Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.866651 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities" (OuterVolumeSpecName: "utilities") pod "0239b380-03c8-455e-a981-2aaaae000828" (UID: "0239b380-03c8-455e-a981-2aaaae000828"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.873374 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59" (OuterVolumeSpecName: "kube-api-access-jhh59") pod "0239b380-03c8-455e-a981-2aaaae000828" (UID: "0239b380-03c8-455e-a981-2aaaae000828"). InnerVolumeSpecName "kube-api-access-jhh59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.934417 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0239b380-03c8-455e-a981-2aaaae000828" (UID: "0239b380-03c8-455e-a981-2aaaae000828"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.964742 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.964787 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0239b380-03c8-455e-a981-2aaaae000828-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:23 crc kubenswrapper[5036]: I0110 16:31:23.964805 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhh59\" (UniqueName: \"kubernetes.io/projected/0239b380-03c8-455e-a981-2aaaae000828-kube-api-access-jhh59\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.064152 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.072956 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.082425 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.088036 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.147289 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gm65z"] Jan 10 16:31:24 crc kubenswrapper[5036]: W0110 16:31:24.160790 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8de44e3_ed07_4c76_8aa8_2265c9cd1805.slice/crio-dd8eaad2b20ec952cb489cb73ac916510769866b219e09535f1899729f3476e1 WatchSource:0}: Error finding container dd8eaad2b20ec952cb489cb73ac916510769866b219e09535f1899729f3476e1: Status 404 returned error can't find the container with id dd8eaad2b20ec952cb489cb73ac916510769866b219e09535f1899729f3476e1 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167260 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content\") pod \"1513baef-e92c-4399-ae0f-b8fe4a738702\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167311 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content\") pod \"1efe898b-dc49-41f9-a296-84f826548896\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167366 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities\") pod \"1efe898b-dc49-41f9-a296-84f826548896\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167408 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wx69v\" (UniqueName: \"kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v\") pod \"1efe898b-dc49-41f9-a296-84f826548896\" (UID: \"1efe898b-dc49-41f9-a296-84f826548896\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167626 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7f7\" (UniqueName: \"kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7\") pod \"1513baef-e92c-4399-ae0f-b8fe4a738702\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.167663 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities\") pod \"1513baef-e92c-4399-ae0f-b8fe4a738702\" (UID: \"1513baef-e92c-4399-ae0f-b8fe4a738702\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.169513 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities" (OuterVolumeSpecName: "utilities") pod "1513baef-e92c-4399-ae0f-b8fe4a738702" (UID: "1513baef-e92c-4399-ae0f-b8fe4a738702"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.170042 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities" (OuterVolumeSpecName: "utilities") pod "1efe898b-dc49-41f9-a296-84f826548896" (UID: "1efe898b-dc49-41f9-a296-84f826548896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.170436 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.170488 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.171603 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7" (OuterVolumeSpecName: "kube-api-access-7q7f7") pod "1513baef-e92c-4399-ae0f-b8fe4a738702" (UID: "1513baef-e92c-4399-ae0f-b8fe4a738702"). InnerVolumeSpecName "kube-api-access-7q7f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.177051 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v" (OuterVolumeSpecName: "kube-api-access-wx69v") pod "1efe898b-dc49-41f9-a296-84f826548896" (UID: "1efe898b-dc49-41f9-a296-84f826548896"). InnerVolumeSpecName "kube-api-access-wx69v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.204235 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1efe898b-dc49-41f9-a296-84f826548896" (UID: "1efe898b-dc49-41f9-a296-84f826548896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.223443 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1513baef-e92c-4399-ae0f-b8fe4a738702" (UID: "1513baef-e92c-4399-ae0f-b8fe4a738702"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271450 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities\") pod \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271510 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gw9w\" (UniqueName: \"kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w\") pod \"a0bc40ca-fd61-4885-871b-3a7964df225a\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271575 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca\") pod \"a0bc40ca-fd61-4885-871b-3a7964df225a\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271670 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content\") pod \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271742 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh2wk\" (UniqueName: \"kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk\") pod \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\" (UID: \"fe3cdeec-7336-463c-bbbb-488ece81fa0b\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.271792 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics\") pod \"a0bc40ca-fd61-4885-871b-3a7964df225a\" (UID: \"a0bc40ca-fd61-4885-871b-3a7964df225a\") " Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.272146 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q7f7\" (UniqueName: \"kubernetes.io/projected/1513baef-e92c-4399-ae0f-b8fe4a738702-kube-api-access-7q7f7\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.272172 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1513baef-e92c-4399-ae0f-b8fe4a738702-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.272187 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1efe898b-dc49-41f9-a296-84f826548896-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.272199 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx69v\" (UniqueName: \"kubernetes.io/projected/1efe898b-dc49-41f9-a296-84f826548896-kube-api-access-wx69v\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.272980 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a0bc40ca-fd61-4885-871b-3a7964df225a" (UID: "a0bc40ca-fd61-4885-871b-3a7964df225a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.273440 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities" (OuterVolumeSpecName: "utilities") pod "fe3cdeec-7336-463c-bbbb-488ece81fa0b" (UID: "fe3cdeec-7336-463c-bbbb-488ece81fa0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.275478 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk" (OuterVolumeSpecName: "kube-api-access-mh2wk") pod "fe3cdeec-7336-463c-bbbb-488ece81fa0b" (UID: "fe3cdeec-7336-463c-bbbb-488ece81fa0b"). InnerVolumeSpecName "kube-api-access-mh2wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.276832 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a0bc40ca-fd61-4885-871b-3a7964df225a" (UID: "a0bc40ca-fd61-4885-871b-3a7964df225a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.277074 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w" (OuterVolumeSpecName: "kube-api-access-7gw9w") pod "a0bc40ca-fd61-4885-871b-3a7964df225a" (UID: "a0bc40ca-fd61-4885-871b-3a7964df225a"). InnerVolumeSpecName "kube-api-access-7gw9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.299458 5036 generic.go:334] "Generic (PLEG): container finished" podID="1efe898b-dc49-41f9-a296-84f826548896" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" exitCode=0 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.299503 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerDied","Data":"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.299584 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vzvbk" event={"ID":"1efe898b-dc49-41f9-a296-84f826548896","Type":"ContainerDied","Data":"a27d7332b755469449bfece3cb9968b9e69d3f3e62ebd6cbce90ccfb06a0c787"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.299607 5036 scope.go:117] "RemoveContainer" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.299904 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vzvbk" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.303405 5036 generic.go:334] "Generic (PLEG): container finished" podID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerID="9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8" exitCode=0 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.303562 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.303578 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerDied","Data":"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.303655 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8k59b" event={"ID":"a0bc40ca-fd61-4885-871b-3a7964df225a","Type":"ContainerDied","Data":"bb28514038fe02be6d06577a766557dc53bd20904fe20e2f017c1728a586a8b4"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.307134 5036 generic.go:334] "Generic (PLEG): container finished" podID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerID="66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0" exitCode=0 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.307227 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerDied","Data":"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.307229 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hm8ns" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.307263 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hm8ns" event={"ID":"1513baef-e92c-4399-ae0f-b8fe4a738702","Type":"ContainerDied","Data":"690c5256f10be1e066e92ef9c938a2d768a8ae9899e67eadf97429cc99ff4a92"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.311374 5036 generic.go:334] "Generic (PLEG): container finished" podID="0239b380-03c8-455e-a981-2aaaae000828" containerID="6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0" exitCode=0 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.311563 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w55c" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.311610 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerDied","Data":"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.311702 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w55c" event={"ID":"0239b380-03c8-455e-a981-2aaaae000828","Type":"ContainerDied","Data":"60b567199e6d9c4d632b9da33745bc57c86d7c89b57d95171070f1869d46c2a8"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.316220 5036 generic.go:334] "Generic (PLEG): container finished" podID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerID="78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227" exitCode=0 Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.316393 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerDied","Data":"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.316431 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q72r7" event={"ID":"fe3cdeec-7336-463c-bbbb-488ece81fa0b","Type":"ContainerDied","Data":"40913a4a3d0e246e3ec19749b2e2cdb4156de220cccc3b6fd6bccdc2b64803d5"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.316530 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q72r7" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.321371 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" event={"ID":"d8de44e3-ed07-4c76-8aa8-2265c9cd1805","Type":"ContainerStarted","Data":"dd8eaad2b20ec952cb489cb73ac916510769866b219e09535f1899729f3476e1"} Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.321713 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.325340 5036 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gm65z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" start-of-body= Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.325397 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" podUID="d8de44e3-ed07-4c76-8aa8-2265c9cd1805" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.69:8080/healthz\": dial tcp 10.217.0.69:8080: connect: connection refused" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.325802 5036 scope.go:117] "RemoveContainer" containerID="7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.363650 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" podStartSLOduration=1.363625786 podStartE2EDuration="1.363625786s" podCreationTimestamp="2026-01-10 16:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:31:24.362646477 +0000 UTC m=+206.232881991" watchObservedRunningTime="2026-01-10 16:31:24.363625786 +0000 UTC m=+206.233861280" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.377060 5036 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.377103 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.377116 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gw9w\" (UniqueName: \"kubernetes.io/projected/a0bc40ca-fd61-4885-871b-3a7964df225a-kube-api-access-7gw9w\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.377133 5036 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a0bc40ca-fd61-4885-871b-3a7964df225a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.377235 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh2wk\" (UniqueName: \"kubernetes.io/projected/fe3cdeec-7336-463c-bbbb-488ece81fa0b-kube-api-access-mh2wk\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc 
kubenswrapper[5036]: I0110 16:31:24.412512 5036 scope.go:117] "RemoveContainer" containerID="ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.413249 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.419907 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vzvbk"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.430039 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.435247 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8k59b"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.457945 5036 scope.go:117] "RemoveContainer" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.459824 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8\": container with ID starting with a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 not found: ID does not exist" containerID="a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.459860 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8"} err="failed to get container status \"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8\": rpc error: code = NotFound desc = could not find container \"a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8\": container with ID starting with a282622aa50896a18e685808ad54def38d2975ece4d893c17c637aa1ff4eb1c8 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.459885 5036 scope.go:117] "RemoveContainer" containerID="7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.461345 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6\": container with ID starting with 7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6 not found: ID does not exist" containerID="7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.461371 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6"} err="failed to get container status \"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6\": rpc error: code = NotFound desc = could not find container \"7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6\": container with ID starting with 7360f60e1d34a27bd4e6b2f5d2b89d35233c1f715b63593b09107486627a56e6 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.461388 5036 scope.go:117] "RemoveContainer" containerID="ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574" Jan 10 16:31:24 crc 
kubenswrapper[5036]: I0110 16:31:24.461623 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.461883 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574\": container with ID starting with ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574 not found: ID does not exist" containerID="ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.461919 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574"} err="failed to get container status \"ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574\": rpc error: code = NotFound desc = could not find container \"ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574\": container with ID starting with ef1939da0fd9a512cc6ff925d37bfe70051f6b8265dff59f6a5dcf0c7aaf8574 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.461938 5036 scope.go:117] "RemoveContainer" containerID="9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.466136 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe3cdeec-7336-463c-bbbb-488ece81fa0b" (UID: "fe3cdeec-7336-463c-bbbb-488ece81fa0b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.466196 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hm8ns"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.475799 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.475878 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w55c"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.478199 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3cdeec-7336-463c-bbbb-488ece81fa0b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.483786 5036 scope.go:117] "RemoveContainer" containerID="6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.497144 5036 scope.go:117] "RemoveContainer" containerID="9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.497617 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8\": container with ID starting with 9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8 not found: ID does not exist" containerID="9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.497668 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8"} err="failed to get container status \"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8\": rpc error: code = NotFound desc = could not find container \"9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8\": container with ID starting with 9e0cdb67eefdcd3a260e5d5a239677bf20e445084cc794c54479b98cfebc17f8 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.497732 5036 scope.go:117] "RemoveContainer" containerID="6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.498144 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d\": container with ID starting with 6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d not found: ID does not exist" containerID="6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.498173 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d"} err="failed to get container status \"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d\": rpc error: code = NotFound desc = could not find container \"6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d\": container with ID starting with 6b87c309cac13c46edddfa5e185e84be2b2002e281913a2591d10279b59e7b6d not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 
16:31:24.498191 5036 scope.go:117] "RemoveContainer" containerID="66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.514132 5036 scope.go:117] "RemoveContainer" containerID="160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.515103 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0239b380-03c8-455e-a981-2aaaae000828" path="/var/lib/kubelet/pods/0239b380-03c8-455e-a981-2aaaae000828/volumes" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.515769 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" path="/var/lib/kubelet/pods/1513baef-e92c-4399-ae0f-b8fe4a738702/volumes" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.516342 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1efe898b-dc49-41f9-a296-84f826548896" path="/var/lib/kubelet/pods/1efe898b-dc49-41f9-a296-84f826548896/volumes" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.517413 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" path="/var/lib/kubelet/pods/a0bc40ca-fd61-4885-871b-3a7964df225a/volumes" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.530761 5036 scope.go:117] "RemoveContainer" containerID="da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.547641 5036 scope.go:117] "RemoveContainer" containerID="66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.548137 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0\": container with ID starting with 66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0 not found: ID does not exist" containerID="66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.548170 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0"} err="failed to get container status \"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0\": rpc error: code = NotFound desc = could not find container \"66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0\": container with ID starting with 66bf421f94760e55957470aa75bef64d0a5e3876c85a0cb5563735e8e71a03e0 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.548193 5036 scope.go:117] "RemoveContainer" containerID="160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.548609 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8\": container with ID starting with 160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8 not found: ID does not exist" containerID="160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.548632 5036 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8"} err="failed to get container status \"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8\": rpc error: code = NotFound desc = could not find container \"160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8\": container with ID starting with 160b1bef80c23df20bbcfc9cab886dc4708b607b4118e3856a5b3d0f4d9299e8 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.548646 5036 scope.go:117] "RemoveContainer" containerID="da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.550368 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909\": container with ID starting with da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909 not found: ID does not exist" containerID="da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.550391 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909"} err="failed to get container status \"da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909\": rpc error: code = NotFound desc = could not find container \"da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909\": container with ID starting with da41d0c8c9abc1724e8899212a0c4e949a52e45eaf87f1bc01623bb61e9c5909 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.550407 5036 scope.go:117] "RemoveContainer" containerID="6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.565150 5036 scope.go:117] "RemoveContainer" containerID="809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.585858 5036 scope.go:117] "RemoveContainer" containerID="472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.604943 5036 scope.go:117] "RemoveContainer" containerID="6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.606553 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0\": container with ID starting with 6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0 not found: ID does not exist" containerID="6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.606703 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0"} err="failed to get container status \"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0\": rpc error: code = NotFound desc = could not find container \"6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0\": container with ID starting with 6f4053a3a05ce1f148133370d38c55b4aef276f6896d87d35d0ee0ceab03a1b0 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.608039 5036 
scope.go:117] "RemoveContainer" containerID="809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.608761 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db\": container with ID starting with 809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db not found: ID does not exist" containerID="809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.608854 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db"} err="failed to get container status \"809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db\": rpc error: code = NotFound desc = could not find container \"809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db\": container with ID starting with 809d23e190d035e9e01b9e73a924074d0a081f50ab385fd0e747c36895f150db not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.608939 5036 scope.go:117] "RemoveContainer" containerID="472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.609510 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753\": container with ID starting with 472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753 not found: ID does not exist" containerID="472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.609546 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753"} err="failed to get container status \"472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753\": rpc error: code = NotFound desc = could not find container \"472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753\": container with ID starting with 472d2ac145d9668cd2ba2b48df2dc9c9521fd0bd8c638a0713a3dae19379b753 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.609572 5036 scope.go:117] "RemoveContainer" containerID="78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.635068 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.635298 5036 scope.go:117] "RemoveContainer" containerID="7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.638575 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q72r7"] Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.650708 5036 scope.go:117] "RemoveContainer" containerID="31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.673226 5036 scope.go:117] "RemoveContainer" containerID="78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.673796 5036 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227\": container with ID starting with 78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227 not found: ID does not exist" containerID="78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.673838 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227"} err="failed to get container status \"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227\": rpc error: code = NotFound desc = could not find container \"78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227\": container with ID starting with 78c3e608babe4cdf0d60330e200910ac080982ae4b52a7bc14a6f3e582767227 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.673861 5036 scope.go:117] "RemoveContainer" containerID="7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.674114 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6\": container with ID starting with 7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6 not found: ID does not exist" containerID="7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.674164 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6"} err="failed to get container status \"7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6\": rpc error: code = NotFound desc = could not find container \"7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6\": container with ID starting with 7fc831a54a1f69526e6455d38235ba018de24daca5ed5ba16eca3058458ca6f6 not found: ID does not exist" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.674185 5036 scope.go:117] "RemoveContainer" containerID="31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032" Jan 10 16:31:24 crc kubenswrapper[5036]: E0110 16:31:24.674893 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032\": container with ID starting with 31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032 not found: ID does not exist" containerID="31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032" Jan 10 16:31:24 crc kubenswrapper[5036]: I0110 16:31:24.674918 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032"} err="failed to get container status \"31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032\": rpc error: code = NotFound desc = could not find container \"31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032\": container with ID starting with 31769f7fa39ebd466438bf4d712640900fc059fe58448a431ca9455f750bf032 not found: ID does not exist" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.187418 5036 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.328858 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" event={"ID":"d8de44e3-ed07-4c76-8aa8-2265c9cd1805","Type":"ContainerStarted","Data":"1d692a31974d6c1236d4e9d745580e74e57d88e5744403d46118a4886e63e188"} Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.332358 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gm65z" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.904259 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.904319 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.904362 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.904941 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:31:25 crc kubenswrapper[5036]: I0110 16:31:25.905004 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f" gracePeriod=600 Jan 10 16:31:26 crc kubenswrapper[5036]: I0110 16:31:26.516368 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" path="/var/lib/kubelet/pods/fe3cdeec-7336-463c-bbbb-488ece81fa0b/volumes" Jan 10 16:31:27 crc kubenswrapper[5036]: I0110 16:31:27.346328 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f" exitCode=0 Jan 10 16:31:27 crc kubenswrapper[5036]: I0110 16:31:27.346985 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f"} Jan 10 16:31:28 crc kubenswrapper[5036]: I0110 16:31:28.354263 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a"} Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.566152 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.566964 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" podUID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" containerName="route-controller-manager" containerID="cri-o://14e6c8c424350661902e0f47696af20492dffc36efb24cf76c37e314de469bed" gracePeriod=30 Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.758714 5036 generic.go:334] "Generic (PLEG): container finished" podID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" containerID="14e6c8c424350661902e0f47696af20492dffc36efb24cf76c37e314de469bed" exitCode=0 Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.758761 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" event={"ID":"c2f5ae2e-140f-4209-91fd-ea887aa999c3","Type":"ContainerDied","Data":"14e6c8c424350661902e0f47696af20492dffc36efb24cf76c37e314de469bed"} Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.925478 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.941286 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert\") pod \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.941356 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config\") pod \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.941411 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkgrh\" (UniqueName: \"kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh\") pod \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.941441 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca\") pod \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\" (UID: \"c2f5ae2e-140f-4209-91fd-ea887aa999c3\") " Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.945399 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config" (OuterVolumeSpecName: "config") pod "c2f5ae2e-140f-4209-91fd-ea887aa999c3" (UID: "c2f5ae2e-140f-4209-91fd-ea887aa999c3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.952004 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2f5ae2e-140f-4209-91fd-ea887aa999c3" (UID: "c2f5ae2e-140f-4209-91fd-ea887aa999c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.966979 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2f5ae2e-140f-4209-91fd-ea887aa999c3" (UID: "c2f5ae2e-140f-4209-91fd-ea887aa999c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:31:55 crc kubenswrapper[5036]: I0110 16:31:55.967036 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh" (OuterVolumeSpecName: "kube-api-access-jkgrh") pod "c2f5ae2e-140f-4209-91fd-ea887aa999c3" (UID: "c2f5ae2e-140f-4209-91fd-ea887aa999c3"). InnerVolumeSpecName "kube-api-access-jkgrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.042645 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2f5ae2e-140f-4209-91fd-ea887aa999c3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.042717 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.042729 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkgrh\" (UniqueName: \"kubernetes.io/projected/c2f5ae2e-140f-4209-91fd-ea887aa999c3-kube-api-access-jkgrh\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.042741 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2f5ae2e-140f-4209-91fd-ea887aa999c3-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.765494 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" event={"ID":"c2f5ae2e-140f-4209-91fd-ea887aa999c3","Type":"ContainerDied","Data":"5886857464e25327312d6be807485cd014b45833664653794aa4416048c2b1b8"} Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.765558 5036 scope.go:117] "RemoveContainer" containerID="14e6c8c424350661902e0f47696af20492dffc36efb24cf76c37e314de469bed" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.765566 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.781262 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.785171 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6bcbf77d-2g2nz"] Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.894781 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7"] Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895131 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895169 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895196 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895214 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895243 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895261 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895280 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895295 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895312 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895327 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895373 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895391 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895409 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895425 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 
16:31:56.895447 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895462 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895482 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895501 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895527 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" containerName="route-controller-manager" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895544 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" containerName="route-controller-manager" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895568 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895584 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895604 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895620 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="extract-content" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895641 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895656 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895676 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895723 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: E0110 16:31:56.895747 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895763 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="extract-utilities" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895964 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="0239b380-03c8-455e-a981-2aaaae000828" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.895994 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3cdeec-7336-463c-bbbb-488ece81fa0b" 
containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896014 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1efe898b-dc49-41f9-a296-84f826548896" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896035 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896061 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bc40ca-fd61-4885-871b-3a7964df225a" containerName="marketplace-operator" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896085 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1513baef-e92c-4399-ae0f-b8fe4a738702" containerName="registry-server" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896101 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" containerName="route-controller-manager" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.896887 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.899632 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.899979 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.901584 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.901746 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.902012 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.904752 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.907232 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7"] Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.954771 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-client-ca\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.954860 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk8x7\" (UniqueName: \"kubernetes.io/projected/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-kube-api-access-xk8x7\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 
16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.955048 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-serving-cert\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:56 crc kubenswrapper[5036]: I0110 16:31:56.955140 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-config\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.056569 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-client-ca\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.056669 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk8x7\" (UniqueName: \"kubernetes.io/projected/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-kube-api-access-xk8x7\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.056800 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-serving-cert\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.056859 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-config\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.058309 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-client-ca\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.059152 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-config\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.065172 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-serving-cert\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.087262 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk8x7\" (UniqueName: \"kubernetes.io/projected/624ab987-72ff-4c5f-9e6a-acaed1a5b0ee-kube-api-access-xk8x7\") pod \"route-controller-manager-85d4fcb4f-tfmq7\" (UID: \"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee\") " pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.219802 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.666825 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7"] Jan 10 16:31:57 crc kubenswrapper[5036]: I0110 16:31:57.775495 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" event={"ID":"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee","Type":"ContainerStarted","Data":"bc27752f21d9e8eeb16a3ea13274d3f45a723503ef81214cd66d020272403c6a"} Jan 10 16:31:58 crc kubenswrapper[5036]: I0110 16:31:58.514440 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f5ae2e-140f-4209-91fd-ea887aa999c3" path="/var/lib/kubelet/pods/c2f5ae2e-140f-4209-91fd-ea887aa999c3/volumes" Jan 10 16:31:58 crc kubenswrapper[5036]: I0110 16:31:58.782203 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" event={"ID":"624ab987-72ff-4c5f-9e6a-acaed1a5b0ee","Type":"ContainerStarted","Data":"59204eceeb6d1233b1a2168c4f0965983371a0ad7c564f5ecf2a4f51076ab913"} Jan 10 16:31:58 crc kubenswrapper[5036]: I0110 16:31:58.783084 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:58 crc kubenswrapper[5036]: I0110 16:31:58.790300 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" Jan 10 16:31:58 crc kubenswrapper[5036]: I0110 16:31:58.806112 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-85d4fcb4f-tfmq7" podStartSLOduration=3.806097582 podStartE2EDuration="3.806097582s" podCreationTimestamp="2026-01-10 16:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:31:58.805091492 +0000 UTC m=+240.675327006" watchObservedRunningTime="2026-01-10 16:31:58.806097582 +0000 UTC m=+240.676333076" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.439919 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4zcj"] Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.440928 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.456269 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4zcj"] Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630174 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-tls\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630364 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68307cc7-c8e2-4f95-bf4b-686d4094f024-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630448 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630526 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-bound-sa-token\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630608 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68307cc7-c8e2-4f95-bf4b-686d4094f024-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630649 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-trusted-ca\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630711 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-certificates\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.630733 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q4k5\" (UniqueName: 
\"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-kube-api-access-7q4k5\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.656854 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732196 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68307cc7-c8e2-4f95-bf4b-686d4094f024-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732284 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-bound-sa-token\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732333 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68307cc7-c8e2-4f95-bf4b-686d4094f024-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732361 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-trusted-ca\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732393 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-certificates\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732416 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q4k5\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-kube-api-access-7q4k5\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.732449 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-tls\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.733044 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68307cc7-c8e2-4f95-bf4b-686d4094f024-ca-trust-extracted\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.733872 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-trusted-ca\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.734904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-certificates\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.740395 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68307cc7-c8e2-4f95-bf4b-686d4094f024-installation-pull-secrets\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.740415 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-registry-tls\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.755285 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-bound-sa-token\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:02 crc kubenswrapper[5036]: I0110 16:32:02.761075 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q4k5\" (UniqueName: \"kubernetes.io/projected/68307cc7-c8e2-4f95-bf4b-686d4094f024-kube-api-access-7q4k5\") pod \"image-registry-66df7c8f76-b4zcj\" (UID: \"68307cc7-c8e2-4f95-bf4b-686d4094f024\") " pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.058662 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.500277 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-b4zcj"] Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.814031 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" event={"ID":"68307cc7-c8e2-4f95-bf4b-686d4094f024","Type":"ContainerStarted","Data":"f7cc5e32af33eae79e532496f00fc59a97c3c89b066cc7a946590b6b361a8885"} Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.814429 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" event={"ID":"68307cc7-c8e2-4f95-bf4b-686d4094f024","Type":"ContainerStarted","Data":"c133779daaae255ceffb033df16742be3abf94b11955538795dee23d2ec6015d"} Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.814491 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:03 crc kubenswrapper[5036]: I0110 16:32:03.854623 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" podStartSLOduration=1.854592871 podStartE2EDuration="1.854592871s" podCreationTimestamp="2026-01-10 16:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:32:03.847607202 +0000 UTC m=+245.717842696" watchObservedRunningTime="2026-01-10 16:32:03.854592871 +0000 UTC m=+245.724828405" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.553055 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khg2q"] Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.554354 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.556450 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.568949 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khg2q"] Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.714572 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-utilities\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.714914 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-catalog-content\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.715015 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtzkx\" (UniqueName: \"kubernetes.io/projected/d18f574b-ac33-4b60-bcbc-856b463b231a-kube-api-access-xtzkx\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.741470 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vdpgl"] Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.742852 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.745155 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.757207 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdpgl"] Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.816620 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-utilities\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.816738 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-catalog-content\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.816775 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtzkx\" (UniqueName: \"kubernetes.io/projected/d18f574b-ac33-4b60-bcbc-856b463b231a-kube-api-access-xtzkx\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.817456 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-utilities\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.817673 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d18f574b-ac33-4b60-bcbc-856b463b231a-catalog-content\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.834875 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtzkx\" (UniqueName: \"kubernetes.io/projected/d18f574b-ac33-4b60-bcbc-856b463b231a-kube-api-access-xtzkx\") pod \"community-operators-khg2q\" (UID: \"d18f574b-ac33-4b60-bcbc-856b463b231a\") " pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.880046 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.918311 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-catalog-content\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.918526 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-utilities\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:07 crc kubenswrapper[5036]: I0110 16:32:07.918658 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7556\" (UniqueName: \"kubernetes.io/projected/91cf1499-408e-4bc3-b3ae-8f435079b904-kube-api-access-c7556\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.019995 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-catalog-content\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.020049 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-utilities\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.020075 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7556\" (UniqueName: \"kubernetes.io/projected/91cf1499-408e-4bc3-b3ae-8f435079b904-kube-api-access-c7556\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.020588 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-utilities\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.020694 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cf1499-408e-4bc3-b3ae-8f435079b904-catalog-content\") pod \"redhat-operators-vdpgl\" (UID: \"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.046714 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7556\" (UniqueName: \"kubernetes.io/projected/91cf1499-408e-4bc3-b3ae-8f435079b904-kube-api-access-c7556\") pod \"redhat-operators-vdpgl\" (UID: 
\"91cf1499-408e-4bc3-b3ae-8f435079b904\") " pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.058588 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.079328 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khg2q"] Jan 10 16:32:08 crc kubenswrapper[5036]: W0110 16:32:08.089748 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18f574b_ac33_4b60_bcbc_856b463b231a.slice/crio-97731376cac86a8514883e98a2e9f410e8e29dad65262b4b376a3e8b42e71a35 WatchSource:0}: Error finding container 97731376cac86a8514883e98a2e9f410e8e29dad65262b4b376a3e8b42e71a35: Status 404 returned error can't find the container with id 97731376cac86a8514883e98a2e9f410e8e29dad65262b4b376a3e8b42e71a35 Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.270811 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdpgl"] Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.847180 5036 generic.go:334] "Generic (PLEG): container finished" podID="d18f574b-ac33-4b60-bcbc-856b463b231a" containerID="7fbe03e599d712764991b03c8fb13c64084880ca8b7fe3a79bc37388a47c29a9" exitCode=0 Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.847390 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khg2q" event={"ID":"d18f574b-ac33-4b60-bcbc-856b463b231a","Type":"ContainerDied","Data":"7fbe03e599d712764991b03c8fb13c64084880ca8b7fe3a79bc37388a47c29a9"} Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.847539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khg2q" event={"ID":"d18f574b-ac33-4b60-bcbc-856b463b231a","Type":"ContainerStarted","Data":"97731376cac86a8514883e98a2e9f410e8e29dad65262b4b376a3e8b42e71a35"} Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.849478 5036 generic.go:334] "Generic (PLEG): container finished" podID="91cf1499-408e-4bc3-b3ae-8f435079b904" containerID="2e32ff1782cec6c2b7537a0bc4af8fbdc03127def613044ae0041ebc16940e22" exitCode=0 Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.849506 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdpgl" event={"ID":"91cf1499-408e-4bc3-b3ae-8f435079b904","Type":"ContainerDied","Data":"2e32ff1782cec6c2b7537a0bc4af8fbdc03127def613044ae0041ebc16940e22"} Jan 10 16:32:08 crc kubenswrapper[5036]: I0110 16:32:08.849521 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdpgl" event={"ID":"91cf1499-408e-4bc3-b3ae-8f435079b904","Type":"ContainerStarted","Data":"36bed241f99a545fdf99570048121103c9f56dda2782c174e47266e11e46b352"} Jan 10 16:32:09 crc kubenswrapper[5036]: I0110 16:32:09.858646 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdpgl" event={"ID":"91cf1499-408e-4bc3-b3ae-8f435079b904","Type":"ContainerStarted","Data":"fd2daa6081a2d525543e9714dfff84cca6504eda1cc1d4b3538bdf2f2265084b"} Jan 10 16:32:09 crc kubenswrapper[5036]: I0110 16:32:09.943747 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nfqrz"] Jan 10 16:32:09 crc kubenswrapper[5036]: I0110 16:32:09.945055 5036 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:09 crc kubenswrapper[5036]: I0110 16:32:09.953829 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 10 16:32:09 crc kubenswrapper[5036]: I0110 16:32:09.966496 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfqrz"] Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.047703 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-catalog-content\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.047758 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-utilities\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.047784 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggf5\" (UniqueName: \"kubernetes.io/projected/3195b346-8a73-4e01-9842-5a7fde228f6e-kube-api-access-4ggf5\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.147722 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-28lcz"] Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.149349 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggf5\" (UniqueName: \"kubernetes.io/projected/3195b346-8a73-4e01-9842-5a7fde228f6e-kube-api-access-4ggf5\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.149514 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-catalog-content\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.149561 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-utilities\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.149360 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.150246 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-utilities\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.150586 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3195b346-8a73-4e01-9842-5a7fde228f6e-catalog-content\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.153209 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.161764 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28lcz"] Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.179569 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggf5\" (UniqueName: \"kubernetes.io/projected/3195b346-8a73-4e01-9842-5a7fde228f6e-kube-api-access-4ggf5\") pod \"certified-operators-nfqrz\" (UID: \"3195b346-8a73-4e01-9842-5a7fde228f6e\") " pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.250694 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-catalog-content\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.250750 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-utilities\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.250798 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68842\" (UniqueName: \"kubernetes.io/projected/dd0fc5aa-292a-4009-8cdf-0534293491f3-kube-api-access-68842\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.293869 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.352487 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-catalog-content\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.352797 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-utilities\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.352998 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68842\" (UniqueName: \"kubernetes.io/projected/dd0fc5aa-292a-4009-8cdf-0534293491f3-kube-api-access-68842\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.353140 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-catalog-content\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.353477 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd0fc5aa-292a-4009-8cdf-0534293491f3-utilities\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.374156 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68842\" (UniqueName: \"kubernetes.io/projected/dd0fc5aa-292a-4009-8cdf-0534293491f3-kube-api-access-68842\") pod \"redhat-marketplace-28lcz\" (UID: \"dd0fc5aa-292a-4009-8cdf-0534293491f3\") " pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.476538 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.701992 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfqrz"] Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.867811 5036 generic.go:334] "Generic (PLEG): container finished" podID="3195b346-8a73-4e01-9842-5a7fde228f6e" containerID="f26c0eacba47e69726c83b85b164bd61e2b5080bce8fabdccc2a7bd7b54b73c1" exitCode=0 Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.867905 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfqrz" event={"ID":"3195b346-8a73-4e01-9842-5a7fde228f6e","Type":"ContainerDied","Data":"f26c0eacba47e69726c83b85b164bd61e2b5080bce8fabdccc2a7bd7b54b73c1"} Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.867947 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfqrz" event={"ID":"3195b346-8a73-4e01-9842-5a7fde228f6e","Type":"ContainerStarted","Data":"6a09b1556451aaa3d80a400984598ea3e86459d3a4b685e2218c2100b6896d0b"} Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.871735 5036 generic.go:334] "Generic (PLEG): container finished" podID="d18f574b-ac33-4b60-bcbc-856b463b231a" containerID="8d1e9b0496f741d5da0a67aaa5ae5ab249038f6d00c0437c3ba10eaf62d39c94" exitCode=0 Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.871858 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khg2q" event={"ID":"d18f574b-ac33-4b60-bcbc-856b463b231a","Type":"ContainerDied","Data":"8d1e9b0496f741d5da0a67aaa5ae5ab249038f6d00c0437c3ba10eaf62d39c94"} Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.876036 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-28lcz"] Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.878856 5036 generic.go:334] "Generic (PLEG): container finished" podID="91cf1499-408e-4bc3-b3ae-8f435079b904" containerID="fd2daa6081a2d525543e9714dfff84cca6504eda1cc1d4b3538bdf2f2265084b" exitCode=0 Jan 10 16:32:10 crc kubenswrapper[5036]: I0110 16:32:10.878902 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdpgl" event={"ID":"91cf1499-408e-4bc3-b3ae-8f435079b904","Type":"ContainerDied","Data":"fd2daa6081a2d525543e9714dfff84cca6504eda1cc1d4b3538bdf2f2265084b"} Jan 10 16:32:10 crc kubenswrapper[5036]: W0110 16:32:10.911667 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0fc5aa_292a_4009_8cdf_0534293491f3.slice/crio-823826c9f1885cc1990e0517be56ac5bd2d27ebb8900da01c41eadd3acf15491 WatchSource:0}: Error finding container 823826c9f1885cc1990e0517be56ac5bd2d27ebb8900da01c41eadd3acf15491: Status 404 returned error can't find the container with id 823826c9f1885cc1990e0517be56ac5bd2d27ebb8900da01c41eadd3acf15491 Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.885800 5036 generic.go:334] "Generic (PLEG): container finished" podID="dd0fc5aa-292a-4009-8cdf-0534293491f3" containerID="01c127dedafa459a7e34101e0a64cf272dc3b2d1b7f270494e68e86777621dfa" exitCode=0 Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.885995 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28lcz" 
event={"ID":"dd0fc5aa-292a-4009-8cdf-0534293491f3","Type":"ContainerDied","Data":"01c127dedafa459a7e34101e0a64cf272dc3b2d1b7f270494e68e86777621dfa"} Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.886068 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28lcz" event={"ID":"dd0fc5aa-292a-4009-8cdf-0534293491f3","Type":"ContainerStarted","Data":"823826c9f1885cc1990e0517be56ac5bd2d27ebb8900da01c41eadd3acf15491"} Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.891470 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khg2q" event={"ID":"d18f574b-ac33-4b60-bcbc-856b463b231a","Type":"ContainerStarted","Data":"2ebe42a32e88889656a108f40536dd524bd290a8c435c4594bfe7e341ed45ce8"} Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.894286 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdpgl" event={"ID":"91cf1499-408e-4bc3-b3ae-8f435079b904","Type":"ContainerStarted","Data":"63b09f3d2ff0ca71285135c947d8e267ecde8a0910873c7c80507837a93d0e92"} Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.896177 5036 generic.go:334] "Generic (PLEG): container finished" podID="3195b346-8a73-4e01-9842-5a7fde228f6e" containerID="353385e4a9a71bf17955f3dce6d3cd6d98c0359168840bfc0b59e790b243e1e3" exitCode=0 Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.896219 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfqrz" event={"ID":"3195b346-8a73-4e01-9842-5a7fde228f6e","Type":"ContainerDied","Data":"353385e4a9a71bf17955f3dce6d3cd6d98c0359168840bfc0b59e790b243e1e3"} Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.923871 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khg2q" podStartSLOduration=2.439624055 podStartE2EDuration="4.923850273s" podCreationTimestamp="2026-01-10 16:32:07 +0000 UTC" firstStartedPulling="2026-01-10 16:32:08.849894552 +0000 UTC m=+250.720130056" lastFinishedPulling="2026-01-10 16:32:11.33412078 +0000 UTC m=+253.204356274" observedRunningTime="2026-01-10 16:32:11.918788482 +0000 UTC m=+253.789024016" watchObservedRunningTime="2026-01-10 16:32:11.923850273 +0000 UTC m=+253.794085767" Jan 10 16:32:11 crc kubenswrapper[5036]: I0110 16:32:11.952161 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vdpgl" podStartSLOduration=2.462094637 podStartE2EDuration="4.952124999s" podCreationTimestamp="2026-01-10 16:32:07 +0000 UTC" firstStartedPulling="2026-01-10 16:32:08.853005535 +0000 UTC m=+250.723241049" lastFinishedPulling="2026-01-10 16:32:11.343035917 +0000 UTC m=+253.213271411" observedRunningTime="2026-01-10 16:32:11.94347791 +0000 UTC m=+253.813713414" watchObservedRunningTime="2026-01-10 16:32:11.952124999 +0000 UTC m=+253.822360533" Jan 10 16:32:12 crc kubenswrapper[5036]: I0110 16:32:12.905247 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfqrz" event={"ID":"3195b346-8a73-4e01-9842-5a7fde228f6e","Type":"ContainerStarted","Data":"8a3a35a728575a1959adb81a6d008ed7339573ee62ccd42fdf0cdb857c7cdb6e"} Jan 10 16:32:12 crc kubenswrapper[5036]: I0110 16:32:12.906783 5036 generic.go:334] "Generic (PLEG): container finished" podID="dd0fc5aa-292a-4009-8cdf-0534293491f3" containerID="e7855d5ec709722118c6d4259910e067c261b06a93f0ad56e04d164765364443" exitCode=0 Jan 10 16:32:12 crc 
kubenswrapper[5036]: I0110 16:32:12.906866 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28lcz" event={"ID":"dd0fc5aa-292a-4009-8cdf-0534293491f3","Type":"ContainerDied","Data":"e7855d5ec709722118c6d4259910e067c261b06a93f0ad56e04d164765364443"} Jan 10 16:32:12 crc kubenswrapper[5036]: I0110 16:32:12.940782 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nfqrz" podStartSLOduration=2.409071471 podStartE2EDuration="3.940765909s" podCreationTimestamp="2026-01-10 16:32:09 +0000 UTC" firstStartedPulling="2026-01-10 16:32:10.876633291 +0000 UTC m=+252.746868795" lastFinishedPulling="2026-01-10 16:32:12.408327739 +0000 UTC m=+254.278563233" observedRunningTime="2026-01-10 16:32:12.939204303 +0000 UTC m=+254.809439807" watchObservedRunningTime="2026-01-10 16:32:12.940765909 +0000 UTC m=+254.811001403" Jan 10 16:32:13 crc kubenswrapper[5036]: I0110 16:32:13.915832 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-28lcz" event={"ID":"dd0fc5aa-292a-4009-8cdf-0534293491f3","Type":"ContainerStarted","Data":"c5b2328b8ff9072c92638c8a71829bb71cc738ab11db79bfa9d42b02576d6013"} Jan 10 16:32:13 crc kubenswrapper[5036]: I0110 16:32:13.935915 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-28lcz" podStartSLOduration=2.372730764 podStartE2EDuration="3.935896433s" podCreationTimestamp="2026-01-10 16:32:10 +0000 UTC" firstStartedPulling="2026-01-10 16:32:11.888231848 +0000 UTC m=+253.758467342" lastFinishedPulling="2026-01-10 16:32:13.451397497 +0000 UTC m=+255.321633011" observedRunningTime="2026-01-10 16:32:13.932721428 +0000 UTC m=+255.802956922" watchObservedRunningTime="2026-01-10 16:32:13.935896433 +0000 UTC m=+255.806131927" Jan 10 16:32:15 crc kubenswrapper[5036]: I0110 16:32:15.546104 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:32:15 crc kubenswrapper[5036]: I0110 16:32:15.546631 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerName="controller-manager" containerID="cri-o://57da2a26c88ab17c84ee530c269548bc00745fde4fad8e450622e6bf6a76f96b" gracePeriod=30 Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.226122 5036 patch_prober.go:28] interesting pod/controller-manager-78b59b8d64-vnmf7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" start-of-body= Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.226183 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": dial tcp 10.217.0.68:8443: connect: connection refused" Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.881122 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.881209 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.934857 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.943244 5036 generic.go:334] "Generic (PLEG): container finished" podID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerID="57da2a26c88ab17c84ee530c269548bc00745fde4fad8e450622e6bf6a76f96b" exitCode=0 Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.944313 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" event={"ID":"587a465d-a421-4b2c-834f-9640d82b1a6f","Type":"ContainerDied","Data":"57da2a26c88ab17c84ee530c269548bc00745fde4fad8e450622e6bf6a76f96b"} Jan 10 16:32:17 crc kubenswrapper[5036]: I0110 16:32:17.980575 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khg2q" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.059387 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.059456 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.108156 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.745933 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.782016 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-kvvsh"] Jan 10 16:32:18 crc kubenswrapper[5036]: E0110 16:32:18.782372 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerName="controller-manager" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.782390 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerName="controller-manager" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.782552 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" containerName="controller-manager" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.783077 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.791466 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-kvvsh"] Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.863814 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert\") pod \"587a465d-a421-4b2c-834f-9640d82b1a6f\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.863957 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swzwm\" (UniqueName: \"kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm\") pod \"587a465d-a421-4b2c-834f-9640d82b1a6f\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864000 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles\") pod \"587a465d-a421-4b2c-834f-9640d82b1a6f\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864025 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca\") pod \"587a465d-a421-4b2c-834f-9640d82b1a6f\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864085 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config\") pod \"587a465d-a421-4b2c-834f-9640d82b1a6f\" (UID: \"587a465d-a421-4b2c-834f-9640d82b1a6f\") " Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864290 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-client-ca\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864335 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f856229-42c0-4fd9-b694-5e0da37fae66-serving-cert\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864364 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzl5n\" (UniqueName: \"kubernetes.io/projected/3f856229-42c0-4fd9-b694-5e0da37fae66-kube-api-access-nzl5n\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-config\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864753 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.865010 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "587a465d-a421-4b2c-834f-9640d82b1a6f" (UID: "587a465d-a421-4b2c-834f-9640d82b1a6f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.865037 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config" (OuterVolumeSpecName: "config") pod "587a465d-a421-4b2c-834f-9640d82b1a6f" (UID: "587a465d-a421-4b2c-834f-9640d82b1a6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.864998 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca" (OuterVolumeSpecName: "client-ca") pod "587a465d-a421-4b2c-834f-9640d82b1a6f" (UID: "587a465d-a421-4b2c-834f-9640d82b1a6f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.870262 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "587a465d-a421-4b2c-834f-9640d82b1a6f" (UID: "587a465d-a421-4b2c-834f-9640d82b1a6f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.879998 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm" (OuterVolumeSpecName: "kube-api-access-swzwm") pod "587a465d-a421-4b2c-834f-9640d82b1a6f" (UID: "587a465d-a421-4b2c-834f-9640d82b1a6f"). InnerVolumeSpecName "kube-api-access-swzwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.952394 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.952841 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b59b8d64-vnmf7" event={"ID":"587a465d-a421-4b2c-834f-9640d82b1a6f","Type":"ContainerDied","Data":"537630ad32e06212bbbb2ebdda8070dc8cb01b2ed0dfcf32757e43794ca36b38"} Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.953039 5036 scope.go:117] "RemoveContainer" containerID="57da2a26c88ab17c84ee530c269548bc00745fde4fad8e450622e6bf6a76f96b" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.965961 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-client-ca\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966006 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f856229-42c0-4fd9-b694-5e0da37fae66-serving-cert\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966033 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzl5n\" (UniqueName: \"kubernetes.io/projected/3f856229-42c0-4fd9-b694-5e0da37fae66-kube-api-access-nzl5n\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966077 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-config\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966095 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966154 5036 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/587a465d-a421-4b2c-834f-9640d82b1a6f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966167 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swzwm\" (UniqueName: \"kubernetes.io/projected/587a465d-a421-4b2c-834f-9640d82b1a6f-kube-api-access-swzwm\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966178 5036 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:18 crc 
kubenswrapper[5036]: I0110 16:32:18.966187 5036 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.966196 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587a465d-a421-4b2c-834f-9640d82b1a6f-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.967813 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-client-ca\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.967904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-proxy-ca-bundles\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.968044 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f856229-42c0-4fd9-b694-5e0da37fae66-config\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.973128 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f856229-42c0-4fd9-b694-5e0da37fae66-serving-cert\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.994848 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.996394 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78b59b8d64-vnmf7"] Jan 10 16:32:18 crc kubenswrapper[5036]: I0110 16:32:18.999443 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzl5n\" (UniqueName: \"kubernetes.io/projected/3f856229-42c0-4fd9-b694-5e0da37fae66-kube-api-access-nzl5n\") pod \"controller-manager-774c5dd755-kvvsh\" (UID: \"3f856229-42c0-4fd9-b694-5e0da37fae66\") " pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:19 crc kubenswrapper[5036]: I0110 16:32:19.009555 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vdpgl" Jan 10 16:32:19 crc kubenswrapper[5036]: I0110 16:32:19.098476 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:19 crc kubenswrapper[5036]: I0110 16:32:19.499146 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c5dd755-kvvsh"] Jan 10 16:32:19 crc kubenswrapper[5036]: W0110 16:32:19.510900 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f856229_42c0_4fd9_b694_5e0da37fae66.slice/crio-de08edac8add2cf8632f1d30713734be01df0eab46c9c8fe8f67d9032cc1f15f WatchSource:0}: Error finding container de08edac8add2cf8632f1d30713734be01df0eab46c9c8fe8f67d9032cc1f15f: Status 404 returned error can't find the container with id de08edac8add2cf8632f1d30713734be01df0eab46c9c8fe8f67d9032cc1f15f Jan 10 16:32:19 crc kubenswrapper[5036]: I0110 16:32:19.958822 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" event={"ID":"3f856229-42c0-4fd9-b694-5e0da37fae66","Type":"ContainerStarted","Data":"de08edac8add2cf8632f1d30713734be01df0eab46c9c8fe8f67d9032cc1f15f"} Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.294251 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.294327 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.342111 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.476865 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.476930 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.519140 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587a465d-a421-4b2c-834f-9640d82b1a6f" path="/var/lib/kubelet/pods/587a465d-a421-4b2c-834f-9640d82b1a6f/volumes" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.534877 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.966222 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" event={"ID":"3f856229-42c0-4fd9-b694-5e0da37fae66","Type":"ContainerStarted","Data":"b73b470951588b6c2273c8ca116ca38b3d391b2550558696b0b04adbd8be9d9a"} Jan 10 16:32:20 crc kubenswrapper[5036]: I0110 16:32:20.987779 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" podStartSLOduration=5.987760422 podStartE2EDuration="5.987760422s" podCreationTimestamp="2026-01-10 16:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:32:20.984174786 +0000 UTC m=+262.854410290" watchObservedRunningTime="2026-01-10 16:32:20.987760422 +0000 UTC m=+262.857995916" Jan 10 16:32:21 crc kubenswrapper[5036]: I0110 16:32:21.007029 
5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nfqrz" Jan 10 16:32:21 crc kubenswrapper[5036]: I0110 16:32:21.019363 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-28lcz" Jan 10 16:32:21 crc kubenswrapper[5036]: I0110 16:32:21.971512 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:21 crc kubenswrapper[5036]: I0110 16:32:21.976670 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774c5dd755-kvvsh" Jan 10 16:32:23 crc kubenswrapper[5036]: I0110 16:32:23.067865 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-b4zcj" Jan 10 16:32:23 crc kubenswrapper[5036]: I0110 16:32:23.130978 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.173079 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" podUID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" containerName="registry" containerID="cri-o://0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669" gracePeriod=30 Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.715606 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830291 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830601 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830632 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drfm4\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830753 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830821 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc 
kubenswrapper[5036]: I0110 16:32:48.830848 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830882 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.830963 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca\") pod \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\" (UID: \"d8d9ae9f-271e-402d-8ec6-a2e25057090e\") " Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.831951 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.832016 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.836699 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.836948 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4" (OuterVolumeSpecName: "kube-api-access-drfm4") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "kube-api-access-drfm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.839701 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.841312 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.841342 5036 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.841357 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drfm4\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-kube-api-access-drfm4\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.841371 5036 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.841383 5036 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d8d9ae9f-271e-402d-8ec6-a2e25057090e-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.842473 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.849762 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.852754 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d8d9ae9f-271e-402d-8ec6-a2e25057090e" (UID: "d8d9ae9f-271e-402d-8ec6-a2e25057090e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.942063 5036 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d8d9ae9f-271e-402d-8ec6-a2e25057090e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:48 crc kubenswrapper[5036]: I0110 16:32:48.942109 5036 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d8d9ae9f-271e-402d-8ec6-a2e25057090e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.142452 5036 generic.go:334] "Generic (PLEG): container finished" podID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" containerID="0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669" exitCode=0 Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.142504 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" event={"ID":"d8d9ae9f-271e-402d-8ec6-a2e25057090e","Type":"ContainerDied","Data":"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669"} Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.142532 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" event={"ID":"d8d9ae9f-271e-402d-8ec6-a2e25057090e","Type":"ContainerDied","Data":"5c46d35bab5af69a8e35b49ae719acb555344724cbb82e6100619425a6639311"} Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.142527 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mjcps" Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.142550 5036 scope.go:117] "RemoveContainer" containerID="0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669" Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.175119 5036 scope.go:117] "RemoveContainer" containerID="0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669" Jan 10 16:32:49 crc kubenswrapper[5036]: E0110 16:32:49.176125 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669\": container with ID starting with 0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669 not found: ID does not exist" containerID="0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669" Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.176273 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669"} err="failed to get container status \"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669\": rpc error: code = NotFound desc = could not find container \"0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669\": container with ID starting with 0e0f6c24e4ed9a299edf96b9057e859842eea2dbe60d2977ed60e9c399ad3669 not found: ID does not exist" Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.202242 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:32:49 crc kubenswrapper[5036]: I0110 16:32:49.206770 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mjcps"] Jan 10 16:32:50 crc 
kubenswrapper[5036]: I0110 16:32:50.520431 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" path="/var/lib/kubelet/pods/d8d9ae9f-271e-402d-8ec6-a2e25057090e/volumes" Jan 10 16:32:58 crc kubenswrapper[5036]: I0110 16:32:58.272574 5036 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 10 16:33:55 crc kubenswrapper[5036]: I0110 16:33:55.904254 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:33:55 crc kubenswrapper[5036]: I0110 16:33:55.904737 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:34:25 crc kubenswrapper[5036]: I0110 16:34:25.904383 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:34:25 crc kubenswrapper[5036]: I0110 16:34:25.905011 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:34:55 crc kubenswrapper[5036]: I0110 16:34:55.904912 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:34:55 crc kubenswrapper[5036]: I0110 16:34:55.905618 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:34:55 crc kubenswrapper[5036]: I0110 16:34:55.905667 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:34:56 crc kubenswrapper[5036]: I0110 16:34:56.117229 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:34:56 crc kubenswrapper[5036]: I0110 16:34:56.117348 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" 
containerName="machine-config-daemon" containerID="cri-o://af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a" gracePeriod=600 Jan 10 16:34:57 crc kubenswrapper[5036]: I0110 16:34:57.126104 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a" exitCode=0 Jan 10 16:34:57 crc kubenswrapper[5036]: I0110 16:34:57.126180 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a"} Jan 10 16:34:57 crc kubenswrapper[5036]: I0110 16:34:57.126702 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2"} Jan 10 16:34:57 crc kubenswrapper[5036]: I0110 16:34:57.126748 5036 scope.go:117] "RemoveContainer" containerID="aae30e525ba7b9a8f43d42033f9ba0d3065ee1415e836584cee9ed215de60e5f" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.760122 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm"] Jan 10 16:36:39 crc kubenswrapper[5036]: E0110 16:36:39.761656 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" containerName="registry" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.761740 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" containerName="registry" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.761875 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8d9ae9f-271e-402d-8ec6-a2e25057090e" containerName="registry" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.762291 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.770264 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.770335 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.770701 5036 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qd7jh" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.771386 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq86k\" (UniqueName: \"kubernetes.io/projected/2574f8f4-e56e-4d7e-b181-5e01d69b1485-kube-api-access-fq86k\") pod \"cert-manager-cainjector-cf98fcc89-b7rxm\" (UID: \"2574f8f4-e56e-4d7e-b181-5e01d69b1485\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.774817 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-hrdzf"] Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.775644 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hrdzf" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.777777 5036 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-b4djx" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.785408 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm"] Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.797442 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgpxj"] Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.798142 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.799817 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hrdzf"] Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.799983 5036 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-v9lsq" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.868253 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgpxj"] Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.875837 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq86k\" (UniqueName: \"kubernetes.io/projected/2574f8f4-e56e-4d7e-b181-5e01d69b1485-kube-api-access-fq86k\") pod \"cert-manager-cainjector-cf98fcc89-b7rxm\" (UID: \"2574f8f4-e56e-4d7e-b181-5e01d69b1485\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.877496 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fsgc\" (UniqueName: \"kubernetes.io/projected/ef12a866-7983-4859-8d00-6ba6ed292af3-kube-api-access-4fsgc\") pod \"cert-manager-858654f9db-hrdzf\" (UID: \"ef12a866-7983-4859-8d00-6ba6ed292af3\") " pod="cert-manager/cert-manager-858654f9db-hrdzf" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.894321 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq86k\" (UniqueName: \"kubernetes.io/projected/2574f8f4-e56e-4d7e-b181-5e01d69b1485-kube-api-access-fq86k\") pod \"cert-manager-cainjector-cf98fcc89-b7rxm\" (UID: \"2574f8f4-e56e-4d7e-b181-5e01d69b1485\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.979415 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fsgc\" (UniqueName: \"kubernetes.io/projected/ef12a866-7983-4859-8d00-6ba6ed292af3-kube-api-access-4fsgc\") pod \"cert-manager-858654f9db-hrdzf\" (UID: \"ef12a866-7983-4859-8d00-6ba6ed292af3\") " pod="cert-manager/cert-manager-858654f9db-hrdzf" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.979589 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5bc\" (UniqueName: \"kubernetes.io/projected/b15af209-c459-40f3-affc-0d5a3d2b031d-kube-api-access-xg5bc\") pod \"cert-manager-webhook-687f57d79b-pgpxj\" (UID: \"b15af209-c459-40f3-affc-0d5a3d2b031d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:39 crc kubenswrapper[5036]: I0110 16:36:39.999142 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4fsgc\" (UniqueName: \"kubernetes.io/projected/ef12a866-7983-4859-8d00-6ba6ed292af3-kube-api-access-4fsgc\") pod \"cert-manager-858654f9db-hrdzf\" (UID: \"ef12a866-7983-4859-8d00-6ba6ed292af3\") " pod="cert-manager/cert-manager-858654f9db-hrdzf" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.080802 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5bc\" (UniqueName: \"kubernetes.io/projected/b15af209-c459-40f3-affc-0d5a3d2b031d-kube-api-access-xg5bc\") pod \"cert-manager-webhook-687f57d79b-pgpxj\" (UID: \"b15af209-c459-40f3-affc-0d5a3d2b031d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.096723 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5bc\" (UniqueName: \"kubernetes.io/projected/b15af209-c459-40f3-affc-0d5a3d2b031d-kube-api-access-xg5bc\") pod \"cert-manager-webhook-687f57d79b-pgpxj\" (UID: \"b15af209-c459-40f3-affc-0d5a3d2b031d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.148866 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.160785 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hrdzf" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.166824 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.402847 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgpxj"] Jan 10 16:36:40 crc kubenswrapper[5036]: W0110 16:36:40.405785 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb15af209_c459_40f3_affc_0d5a3d2b031d.slice/crio-56b3a6cce49ddb14e84b202424844ee316b2002b13260b63a555dde24738b899 WatchSource:0}: Error finding container 56b3a6cce49ddb14e84b202424844ee316b2002b13260b63a555dde24738b899: Status 404 returned error can't find the container with id 56b3a6cce49ddb14e84b202424844ee316b2002b13260b63a555dde24738b899 Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.407847 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.615271 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hrdzf"] Jan 10 16:36:40 crc kubenswrapper[5036]: W0110 16:36:40.619974 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef12a866_7983_4859_8d00_6ba6ed292af3.slice/crio-b62d398e89d447703c124e94fcc47ff11126ed70dd4e67377856a26dd457dcfb WatchSource:0}: Error finding container b62d398e89d447703c124e94fcc47ff11126ed70dd4e67377856a26dd457dcfb: Status 404 returned error can't find the container with id b62d398e89d447703c124e94fcc47ff11126ed70dd4e67377856a26dd457dcfb Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.637170 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm"] Jan 10 16:36:40 crc kubenswrapper[5036]: W0110 
16:36:40.642361 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2574f8f4_e56e_4d7e_b181_5e01d69b1485.slice/crio-85a439f2bf59cdaba74a71bc98de9f5a8e6de01f90db337c74d4250457c62e6d WatchSource:0}: Error finding container 85a439f2bf59cdaba74a71bc98de9f5a8e6de01f90db337c74d4250457c62e6d: Status 404 returned error can't find the container with id 85a439f2bf59cdaba74a71bc98de9f5a8e6de01f90db337c74d4250457c62e6d Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.927827 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" event={"ID":"2574f8f4-e56e-4d7e-b181-5e01d69b1485","Type":"ContainerStarted","Data":"85a439f2bf59cdaba74a71bc98de9f5a8e6de01f90db337c74d4250457c62e6d"} Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.929991 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hrdzf" event={"ID":"ef12a866-7983-4859-8d00-6ba6ed292af3","Type":"ContainerStarted","Data":"b62d398e89d447703c124e94fcc47ff11126ed70dd4e67377856a26dd457dcfb"} Jan 10 16:36:40 crc kubenswrapper[5036]: I0110 16:36:40.930832 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" event={"ID":"b15af209-c459-40f3-affc-0d5a3d2b031d","Type":"ContainerStarted","Data":"56b3a6cce49ddb14e84b202424844ee316b2002b13260b63a555dde24738b899"} Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.954904 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" event={"ID":"b15af209-c459-40f3-affc-0d5a3d2b031d","Type":"ContainerStarted","Data":"4453dd0ca84ae117261a62929d43e4d3963f656e33ce9fa60c9d62b0ec1cf823"} Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.955879 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.957962 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" event={"ID":"2574f8f4-e56e-4d7e-b181-5e01d69b1485","Type":"ContainerStarted","Data":"fb85ec0e5563b0415a981e6e592cbe6ff060c4c62431404b7b630235798b81d4"} Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.960369 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hrdzf" event={"ID":"ef12a866-7983-4859-8d00-6ba6ed292af3","Type":"ContainerStarted","Data":"3164e5448e39fd2e48216bc9c0f954dd6197bd856264a07cfc579f7efed917fa"} Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.981076 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" podStartSLOduration=2.615573303 podStartE2EDuration="5.981045839s" podCreationTimestamp="2026-01-10 16:36:39 +0000 UTC" firstStartedPulling="2026-01-10 16:36:40.407620168 +0000 UTC m=+522.277855662" lastFinishedPulling="2026-01-10 16:36:43.773092704 +0000 UTC m=+525.643328198" observedRunningTime="2026-01-10 16:36:44.972603385 +0000 UTC m=+526.842838909" watchObservedRunningTime="2026-01-10 16:36:44.981045839 +0000 UTC m=+526.851281363" Jan 10 16:36:44 crc kubenswrapper[5036]: I0110 16:36:44.993057 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b7rxm" podStartSLOduration=2.844910414 podStartE2EDuration="5.993032863s" 
podCreationTimestamp="2026-01-10 16:36:39 +0000 UTC" firstStartedPulling="2026-01-10 16:36:40.646238014 +0000 UTC m=+522.516473508" lastFinishedPulling="2026-01-10 16:36:43.794360453 +0000 UTC m=+525.664595957" observedRunningTime="2026-01-10 16:36:44.989904813 +0000 UTC m=+526.860140337" watchObservedRunningTime="2026-01-10 16:36:44.993032863 +0000 UTC m=+526.863268367" Jan 10 16:36:45 crc kubenswrapper[5036]: I0110 16:36:45.015607 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-hrdzf" podStartSLOduration=2.846061943 podStartE2EDuration="6.015574813s" podCreationTimestamp="2026-01-10 16:36:39 +0000 UTC" firstStartedPulling="2026-01-10 16:36:40.622275608 +0000 UTC m=+522.492511102" lastFinishedPulling="2026-01-10 16:36:43.791788458 +0000 UTC m=+525.662023972" observedRunningTime="2026-01-10 16:36:45.011540151 +0000 UTC m=+526.881775655" watchObservedRunningTime="2026-01-10 16:36:45.015574813 +0000 UTC m=+526.885810307" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.487052 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c4vw5"] Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488123 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-controller" containerID="cri-o://450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488369 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="sbdb" containerID="cri-o://d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488435 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="nbdb" containerID="cri-o://2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488510 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-acl-logging" containerID="cri-o://52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488529 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-node" containerID="cri-o://895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488698 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="northd" containerID="cri-o://12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.488767 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" 
podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.558271 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovnkube-controller" containerID="cri-o://e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" gracePeriod=30 Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.835694 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c4vw5_98fa8c41-2298-4b26-849a-806cc77bcc40/ovn-acl-logging/0.log" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.836466 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c4vw5_98fa8c41-2298-4b26-849a-806cc77bcc40/ovn-controller/0.log" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.836910 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.891892 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hfkvk"] Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892081 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-ovn-metrics" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892092 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-ovn-metrics" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892100 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="nbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892106 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="nbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892116 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovnkube-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892122 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovnkube-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892138 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-node" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892144 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-node" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892150 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="northd" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892155 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="northd" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892168 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" 
containerName="ovn-acl-logging" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892174 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-acl-logging" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892184 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kubecfg-setup" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892190 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kubecfg-setup" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892196 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892203 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: E0110 16:36:49.892211 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="sbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892217 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="sbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892297 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="sbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892307 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovnkube-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892313 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="nbdb" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892322 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-acl-logging" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892330 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="northd" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892339 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-node" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892346 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="ovn-controller" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.892353 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerName="kube-rbac-proxy-ovn-metrics" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.894164 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905220 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-script-lib\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905274 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-config\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905295 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905408 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-kubelet\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905433 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-etc-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905456 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-env-overrides\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905478 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-systemd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905513 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpdgw\" (UniqueName: \"kubernetes.io/projected/2dd21cae-461f-418c-9a30-2e23ba28c555-kube-api-access-jpdgw\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905585 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905621 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-node-log\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905785 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-netns\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905867 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-var-lib-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905914 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-log-socket\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905947 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-ovn\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905971 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dd21cae-461f-418c-9a30-2e23ba28c555-ovn-node-metrics-cert\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.905996 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-systemd-units\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.906033 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-bin\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.906055 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-netd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.906086 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.906108 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-slash\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.999359 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c4vw5_98fa8c41-2298-4b26-849a-806cc77bcc40/ovn-acl-logging/0.log" Jan 10 16:36:49 crc kubenswrapper[5036]: I0110 16:36:49.999954 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-c4vw5_98fa8c41-2298-4b26-849a-806cc77bcc40/ovn-controller/0.log" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000362 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000387 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000398 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000408 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000418 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000427 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" exitCode=0 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000434 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" exitCode=143 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000442 5036 generic.go:334] "Generic (PLEG): container finished" podID="98fa8c41-2298-4b26-849a-806cc77bcc40" 
containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" exitCode=143 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000451 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000473 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000493 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000506 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000526 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000538 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000549 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000562 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000574 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000580 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000595 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000594 5036 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000727 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000752 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000767 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000778 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000790 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000800 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000811 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000822 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000842 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000870 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000884 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000894 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000904 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000915 5036 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000925 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000935 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000946 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000956 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000972 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-c4vw5" event={"ID":"98fa8c41-2298-4b26-849a-806cc77bcc40","Type":"ContainerDied","Data":"7c56941243f558ebeb0929b024f0db8f4f29f0fdf7b48537f8ec36e00cdbf1c7"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.000991 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001003 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001015 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001025 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001036 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001047 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001057 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001068 5036 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.001078 5036 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.002472 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-44nd6_91a78516-865b-40eb-8545-8f24206fe927/kube-multus/0.log" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.002542 5036 generic.go:334] "Generic (PLEG): container finished" podID="91a78516-865b-40eb-8545-8f24206fe927" containerID="4acfe31af1f4adc287a0f51bd888e7ad0d76662326dfa32e1fd87c6efcddb7f9" exitCode=2 Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.002580 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44nd6" event={"ID":"91a78516-865b-40eb-8545-8f24206fe927","Type":"ContainerDied","Data":"4acfe31af1f4adc287a0f51bd888e7ad0d76662326dfa32e1fd87c6efcddb7f9"} Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.003291 5036 scope.go:117] "RemoveContainer" containerID="4acfe31af1f4adc287a0f51bd888e7ad0d76662326dfa32e1fd87c6efcddb7f9" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.007823 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b65c4\" (UniqueName: \"kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.007889 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008090 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008139 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008183 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008238 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008289 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch\") pod 
\"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008331 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008404 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.007992 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008440 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008540 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008462 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008661 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008744 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008795 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008808 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008844 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008859 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008852 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket" (OuterVolumeSpecName: "log-socket") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008893 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash" (OuterVolumeSpecName: "host-slash") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008906 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008916 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008928 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log" (OuterVolumeSpecName: "node-log") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008949 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008981 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.008991 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009024 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009058 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009071 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009105 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009123 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009194 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd\") pod \"98fa8c41-2298-4b26-849a-806cc77bcc40\" (UID: \"98fa8c41-2298-4b26-849a-806cc77bcc40\") " Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009199 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009327 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009447 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-netns\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009474 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-netns\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009563 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-var-lib-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009633 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-log-socket\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009660 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-var-lib-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009731 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-log-socket\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009727 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dd21cae-461f-418c-9a30-2e23ba28c555-ovn-node-metrics-cert\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009791 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-ovn\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009812 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-systemd-units\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009849 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-netd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009871 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-bin\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009872 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-ovn\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009885 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009918 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-netd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009945 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009950 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-cni-bin\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009918 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-run-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-slash\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.009993 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-systemd-units\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010012 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-script-lib\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010378 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-config\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010448 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010513 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-kubelet\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010563 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-env-overrides\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010609 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-etc-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010644 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-systemd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010726 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-systemd\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010782 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-script-lib\") 
pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010842 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-kubelet\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010868 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-etc-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010889 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-run-openvswitch\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010931 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpdgw\" (UniqueName: \"kubernetes.io/projected/2dd21cae-461f-418c-9a30-2e23ba28c555-kube-api-access-jpdgw\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.010977 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011002 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-node-log\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011098 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011262 5036 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011280 5036 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011290 5036 reconciler_common.go:293] "Volume detached for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011301 5036 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011313 5036 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011323 5036 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011333 5036 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011341 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-env-overrides\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011344 5036 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011371 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-node-log\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011387 5036 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011403 5036 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011414 5036 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-log-socket\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011425 5036 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011435 5036 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-node-log\") on 
node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011446 5036 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-host-slash\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011458 5036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011470 5036 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011469 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.011635 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dd21cae-461f-418c-9a30-2e23ba28c555-host-slash\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.012161 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dd21cae-461f-418c-9a30-2e23ba28c555-ovnkube-config\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.015201 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dd21cae-461f-418c-9a30-2e23ba28c555-ovn-node-metrics-cert\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.015713 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4" (OuterVolumeSpecName: "kube-api-access-b65c4") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "kube-api-access-b65c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.015767 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.029421 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.030241 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "98fa8c41-2298-4b26-849a-806cc77bcc40" (UID: "98fa8c41-2298-4b26-849a-806cc77bcc40"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.034008 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpdgw\" (UniqueName: \"kubernetes.io/projected/2dd21cae-461f-418c-9a30-2e23ba28c555-kube-api-access-jpdgw\") pod \"ovnkube-node-hfkvk\" (UID: \"2dd21cae-461f-418c-9a30-2e23ba28c555\") " pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.045904 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.072480 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.095120 5036 scope.go:117] "RemoveContainer" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.111987 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.112645 5036 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98fa8c41-2298-4b26-849a-806cc77bcc40-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.112668 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b65c4\" (UniqueName: \"kubernetes.io/projected/98fa8c41-2298-4b26-849a-806cc77bcc40-kube-api-access-b65c4\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.112695 5036 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98fa8c41-2298-4b26-849a-806cc77bcc40-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.112704 5036 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98fa8c41-2298-4b26-849a-806cc77bcc40-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.129651 5036 scope.go:117] "RemoveContainer" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.146950 5036 scope.go:117] "RemoveContainer" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.160766 5036 scope.go:117] "RemoveContainer" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.169901 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="cert-manager/cert-manager-webhook-687f57d79b-pgpxj" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.177798 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.178187 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.178225 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} err="failed to get container status \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.178282 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.178615 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.178649 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} err="failed to get container status \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.178668 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.179024 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179048 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} err="failed to get container status \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": rpc error: code = NotFound desc = could 
not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179063 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.179335 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179364 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} err="failed to get container status \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": rpc error: code = NotFound desc = could not find container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179379 5036 scope.go:117] "RemoveContainer" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.179720 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179760 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} err="failed to get container status \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.179774 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.180098 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.180119 5036 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} err="failed to get container status \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.180135 5036 scope.go:117] "RemoveContainer" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.180502 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": container with ID starting with 52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c not found: ID does not exist" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.180537 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} err="failed to get container status \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": rpc error: code = NotFound desc = could not find container \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": container with ID starting with 52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.180562 5036 scope.go:117] "RemoveContainer" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.181505 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": container with ID starting with 450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6 not found: ID does not exist" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.181536 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} err="failed to get container status \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": rpc error: code = NotFound desc = could not find container \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": container with ID starting with 450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.181559 5036 scope.go:117] "RemoveContainer" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: E0110 16:36:50.181941 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": container with ID starting with 8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31 not found: ID does not exist" 
containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.181966 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} err="failed to get container status \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": rpc error: code = NotFound desc = could not find container \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": container with ID starting with 8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.181979 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.182847 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} err="failed to get container status \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.182869 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.184139 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} err="failed to get container status \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.184166 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.184713 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} err="failed to get container status \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": rpc error: code = NotFound desc = could not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.184739 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.186516 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} err="failed to get container status \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": rpc error: code = NotFound desc = could not find 
container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.186548 5036 scope.go:117] "RemoveContainer" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.186845 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} err="failed to get container status \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.186862 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187171 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} err="failed to get container status \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187195 5036 scope.go:117] "RemoveContainer" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187471 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} err="failed to get container status \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": rpc error: code = NotFound desc = could not find container \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": container with ID starting with 52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187530 5036 scope.go:117] "RemoveContainer" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187782 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} err="failed to get container status \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": rpc error: code = NotFound desc = could not find container \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": container with ID starting with 450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187806 5036 scope.go:117] "RemoveContainer" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.187984 5036 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} err="failed to get container status \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": rpc error: code = NotFound desc = could not find container \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": container with ID starting with 8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188008 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188169 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} err="failed to get container status \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188185 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188341 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} err="failed to get container status \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188362 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188515 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} err="failed to get container status \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": rpc error: code = NotFound desc = could not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188529 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188705 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} err="failed to get container status \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": rpc error: code = NotFound desc = could not find container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 
12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188725 5036 scope.go:117] "RemoveContainer" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188877 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} err="failed to get container status \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.188893 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.190538 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} err="failed to get container status \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.190590 5036 scope.go:117] "RemoveContainer" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.193890 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} err="failed to get container status \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": rpc error: code = NotFound desc = could not find container \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": container with ID starting with 52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.193936 5036 scope.go:117] "RemoveContainer" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194203 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} err="failed to get container status \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": rpc error: code = NotFound desc = could not find container \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": container with ID starting with 450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194226 5036 scope.go:117] "RemoveContainer" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194397 5036 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} err="failed to get container status \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": rpc error: code = NotFound desc = could not find container \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": container with ID starting with 8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194417 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194654 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} err="failed to get container status \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194692 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.194967 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} err="failed to get container status \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195017 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195341 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} err="failed to get container status \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": rpc error: code = NotFound desc = could not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195365 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195552 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} err="failed to get container status \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": rpc error: code = NotFound desc = could not find container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" Jan 
10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195579 5036 scope.go:117] "RemoveContainer" containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195838 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} err="failed to get container status \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.195857 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.196255 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} err="failed to get container status \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.196282 5036 scope.go:117] "RemoveContainer" containerID="52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.197728 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c"} err="failed to get container status \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": rpc error: code = NotFound desc = could not find container \"52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c\": container with ID starting with 52fd93b68166635549b23fa93cb17073cb12df45c130067d07dc65f7bf8c871c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.197754 5036 scope.go:117] "RemoveContainer" containerID="450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198172 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6"} err="failed to get container status \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": rpc error: code = NotFound desc = could not find container \"450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6\": container with ID starting with 450c193b09eb5beb5e7a154355ccf4e3a12937afcd02c8b33d6458b5079966a6 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198194 5036 scope.go:117] "RemoveContainer" containerID="8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198472 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31"} err="failed to get container status 
\"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": rpc error: code = NotFound desc = could not find container \"8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31\": container with ID starting with 8597499503dcce134093263581a787f821092247fe748e9342a9a1b47724ce31 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198493 5036 scope.go:117] "RemoveContainer" containerID="e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198851 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f"} err="failed to get container status \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": rpc error: code = NotFound desc = could not find container \"e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f\": container with ID starting with e1948bf782204c50e6df633e6096479e32c700c8dac0ddaed4b2cb718722d50f not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.198868 5036 scope.go:117] "RemoveContainer" containerID="d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199085 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402"} err="failed to get container status \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": rpc error: code = NotFound desc = could not find container \"d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402\": container with ID starting with d20aeec53e7b91e3c9f238375e9c80e3ee747d2611488877ca9616778b352402 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199106 5036 scope.go:117] "RemoveContainer" containerID="2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199323 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1"} err="failed to get container status \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": rpc error: code = NotFound desc = could not find container \"2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1\": container with ID starting with 2647463e98b6bc089b4309d5af540eea1791975e73b20532f8aff857a2632fe1 not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199341 5036 scope.go:117] "RemoveContainer" containerID="12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199559 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e"} err="failed to get container status \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": rpc error: code = NotFound desc = could not find container \"12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e\": container with ID starting with 12cbf04109c5a974a3dce34d4dac4bb248ec8df0e221ecfb647cd5e56e77285e not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199574 5036 scope.go:117] "RemoveContainer" 
containerID="a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199915 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a"} err="failed to get container status \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": rpc error: code = NotFound desc = could not find container \"a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a\": container with ID starting with a97758c24afffe4095b1bc69fe676141e966c12f9f59dc0016e56d1eb368887a not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.199933 5036 scope.go:117] "RemoveContainer" containerID="895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.200169 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c"} err="failed to get container status \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": rpc error: code = NotFound desc = could not find container \"895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c\": container with ID starting with 895b61f1ae86281c8f9341ee56636c930a6b2b6dde13cd6ce6b8ea15249b806c not found: ID does not exist" Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.210945 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:50 crc kubenswrapper[5036]: W0110 16:36:50.226412 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dd21cae_461f_418c_9a30_2e23ba28c555.slice/crio-a744f4dbad50e01628e42f7f5e93c72bfdcd29a7945b59599469677c1ff1b19d WatchSource:0}: Error finding container a744f4dbad50e01628e42f7f5e93c72bfdcd29a7945b59599469677c1ff1b19d: Status 404 returned error can't find the container with id a744f4dbad50e01628e42f7f5e93c72bfdcd29a7945b59599469677c1ff1b19d Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.349396 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c4vw5"] Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.352881 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-c4vw5"] Jan 10 16:36:50 crc kubenswrapper[5036]: I0110 16:36:50.516565 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98fa8c41-2298-4b26-849a-806cc77bcc40" path="/var/lib/kubelet/pods/98fa8c41-2298-4b26-849a-806cc77bcc40/volumes" Jan 10 16:36:51 crc kubenswrapper[5036]: I0110 16:36:51.014447 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-44nd6_91a78516-865b-40eb-8545-8f24206fe927/kube-multus/0.log" Jan 10 16:36:51 crc kubenswrapper[5036]: I0110 16:36:51.014608 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44nd6" event={"ID":"91a78516-865b-40eb-8545-8f24206fe927","Type":"ContainerStarted","Data":"c736aa8edab3fa9f170c88697090db0970490217f6d275c3a5dad50672e65e71"} Jan 10 16:36:51 crc kubenswrapper[5036]: I0110 16:36:51.019573 5036 generic.go:334] "Generic (PLEG): container finished" podID="2dd21cae-461f-418c-9a30-2e23ba28c555" containerID="5251318ad2f94417c7e3062bc6a67849dfb777ee58ec94991e2742db925124f2" exitCode=0 Jan 10 16:36:51 crc 
kubenswrapper[5036]: I0110 16:36:51.019578 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerDied","Data":"5251318ad2f94417c7e3062bc6a67849dfb777ee58ec94991e2742db925124f2"} Jan 10 16:36:51 crc kubenswrapper[5036]: I0110 16:36:51.019752 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"a744f4dbad50e01628e42f7f5e93c72bfdcd29a7945b59599469677c1ff1b19d"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.029780 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"98562b6bdca886cd9ef56a6d894094505aa74d1cc4a8a29c122c3f24f8966de9"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.030431 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"4a30727239f14d8665a5eb4fdee39781c8caefd1bf87f32ec131e70f04bd5ee3"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.030450 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"2f67d1d8b5d8649aa2fcc824d15f23fb25ff921f9740e153e6f83d6b3ffc25d6"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.030464 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"13a0659aea2feb8baf333bdc845fb905430f8a9bbf46317b3294accfff041a9f"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.030479 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"133c1c24f96294d77b93854c65b7b49a85d8b23acd812a1ebbf0c2ba62473389"} Jan 10 16:36:52 crc kubenswrapper[5036]: I0110 16:36:52.030492 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"2ffe4f6d99f2f5d0167dd21bdae64e054f5082b3d31e55c89bbcef51c09039c1"} Jan 10 16:36:55 crc kubenswrapper[5036]: I0110 16:36:55.056380 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"73a1169640ea16b62d14199f7662fcd2a21885c5b8218e6ba440024e3ddf6e8f"} Jan 10 16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.076473 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" event={"ID":"2dd21cae-461f-418c-9a30-2e23ba28c555","Type":"ContainerStarted","Data":"4f1e700244b28d04ff5f4f3fc67587b4cb0616786ca196deddc818ac87d88dc1"} Jan 10 16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.077193 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.077212 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 
16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.104439 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.108519 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" podStartSLOduration=9.108500199 podStartE2EDuration="9.108500199s" podCreationTimestamp="2026-01-10 16:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:36:58.104116598 +0000 UTC m=+539.974352102" watchObservedRunningTime="2026-01-10 16:36:58.108500199 +0000 UTC m=+539.978735703" Jan 10 16:36:58 crc kubenswrapper[5036]: I0110 16:36:58.887753 5036 scope.go:117] "RemoveContainer" containerID="d4c1a31617a2e80a8e6151e973246c8b82778a5371b775a7afcd70097d2b5b00" Jan 10 16:36:59 crc kubenswrapper[5036]: I0110 16:36:59.084449 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:36:59 crc kubenswrapper[5036]: I0110 16:36:59.118537 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:37:20 crc kubenswrapper[5036]: I0110 16:37:20.249346 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hfkvk" Jan 10 16:37:25 crc kubenswrapper[5036]: I0110 16:37:25.903834 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:37:25 crc kubenswrapper[5036]: I0110 16:37:25.904495 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.195787 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4"] Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.198058 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.200855 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.212987 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4"] Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.392286 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9j2f\" (UniqueName: \"kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.392340 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.392426 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.493337 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9j2f\" (UniqueName: \"kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.493399 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.493438 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.493988 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.494243 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.527246 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9j2f\" (UniqueName: \"kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:32 crc kubenswrapper[5036]: I0110 16:37:32.825652 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:33 crc kubenswrapper[5036]: I0110 16:37:33.300287 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4"] Jan 10 16:37:33 crc kubenswrapper[5036]: I0110 16:37:33.324583 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" event={"ID":"ccb8fe79-0985-4f47-9885-cb6561c44e59","Type":"ContainerStarted","Data":"89aad951d0806b13803f5de5950dad5769dba4f04465eb43a99fdae53fe7297e"} Jan 10 16:37:34 crc kubenswrapper[5036]: I0110 16:37:34.332381 5036 generic.go:334] "Generic (PLEG): container finished" podID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerID="3206d5a7984c4eb8fe66223b010ceeef5f5958e147ef8703b174c5e81670a5b7" exitCode=0 Jan 10 16:37:34 crc kubenswrapper[5036]: I0110 16:37:34.332494 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" event={"ID":"ccb8fe79-0985-4f47-9885-cb6561c44e59","Type":"ContainerDied","Data":"3206d5a7984c4eb8fe66223b010ceeef5f5958e147ef8703b174c5e81670a5b7"} Jan 10 16:37:36 crc kubenswrapper[5036]: I0110 16:37:36.348820 5036 generic.go:334] "Generic (PLEG): container finished" podID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerID="4113902d333b7a200b8969a7d4fd28f4f7544fee3432232a1097f2428320dbdd" exitCode=0 Jan 10 16:37:36 crc kubenswrapper[5036]: I0110 16:37:36.348887 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" event={"ID":"ccb8fe79-0985-4f47-9885-cb6561c44e59","Type":"ContainerDied","Data":"4113902d333b7a200b8969a7d4fd28f4f7544fee3432232a1097f2428320dbdd"} Jan 10 16:37:37 crc kubenswrapper[5036]: I0110 16:37:37.362572 5036 generic.go:334] "Generic (PLEG): container finished" podID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerID="b63758b52e08cc39957207b5395f5cb5edb7d72576ddf8d9556c8082a2c6adfa" exitCode=0 Jan 10 16:37:37 crc kubenswrapper[5036]: I0110 
16:37:37.362648 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" event={"ID":"ccb8fe79-0985-4f47-9885-cb6561c44e59","Type":"ContainerDied","Data":"b63758b52e08cc39957207b5395f5cb5edb7d72576ddf8d9556c8082a2c6adfa"} Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.728976 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.885344 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9j2f\" (UniqueName: \"kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f\") pod \"ccb8fe79-0985-4f47-9885-cb6561c44e59\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.885474 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util\") pod \"ccb8fe79-0985-4f47-9885-cb6561c44e59\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.885522 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle\") pod \"ccb8fe79-0985-4f47-9885-cb6561c44e59\" (UID: \"ccb8fe79-0985-4f47-9885-cb6561c44e59\") " Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.887645 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle" (OuterVolumeSpecName: "bundle") pod "ccb8fe79-0985-4f47-9885-cb6561c44e59" (UID: "ccb8fe79-0985-4f47-9885-cb6561c44e59"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.897999 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f" (OuterVolumeSpecName: "kube-api-access-w9j2f") pod "ccb8fe79-0985-4f47-9885-cb6561c44e59" (UID: "ccb8fe79-0985-4f47-9885-cb6561c44e59"). InnerVolumeSpecName "kube-api-access-w9j2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.987609 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9j2f\" (UniqueName: \"kubernetes.io/projected/ccb8fe79-0985-4f47-9885-cb6561c44e59-kube-api-access-w9j2f\") on node \"crc\" DevicePath \"\"" Jan 10 16:37:38 crc kubenswrapper[5036]: I0110 16:37:38.987647 5036 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:37:39 crc kubenswrapper[5036]: I0110 16:37:39.204069 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util" (OuterVolumeSpecName: "util") pod "ccb8fe79-0985-4f47-9885-cb6561c44e59" (UID: "ccb8fe79-0985-4f47-9885-cb6561c44e59"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:37:39 crc kubenswrapper[5036]: I0110 16:37:39.290532 5036 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ccb8fe79-0985-4f47-9885-cb6561c44e59-util\") on node \"crc\" DevicePath \"\"" Jan 10 16:37:39 crc kubenswrapper[5036]: I0110 16:37:39.379877 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" event={"ID":"ccb8fe79-0985-4f47-9885-cb6561c44e59","Type":"ContainerDied","Data":"89aad951d0806b13803f5de5950dad5769dba4f04465eb43a99fdae53fe7297e"} Jan 10 16:37:39 crc kubenswrapper[5036]: I0110 16:37:39.379940 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89aad951d0806b13803f5de5950dad5769dba4f04465eb43a99fdae53fe7297e" Jan 10 16:37:39 crc kubenswrapper[5036]: I0110 16:37:39.379959 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.210376 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bnf7g"] Jan 10 16:37:41 crc kubenswrapper[5036]: E0110 16:37:41.210902 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="pull" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.210917 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="pull" Jan 10 16:37:41 crc kubenswrapper[5036]: E0110 16:37:41.210927 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="extract" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.210935 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="extract" Jan 10 16:37:41 crc kubenswrapper[5036]: E0110 16:37:41.210951 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="util" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.210962 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="util" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.211096 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb8fe79-0985-4f47-9885-cb6561c44e59" containerName="extract" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.211510 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.213940 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.214337 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.220973 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ljvdv" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.221052 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bnf7g"] Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.317077 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdkhd\" (UniqueName: \"kubernetes.io/projected/d8791ab9-ee3b-4af7-98d5-2bc06f5d863a-kube-api-access-cdkhd\") pod \"nmstate-operator-6769fb99d-bnf7g\" (UID: \"d8791ab9-ee3b-4af7-98d5-2bc06f5d863a\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.418023 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdkhd\" (UniqueName: \"kubernetes.io/projected/d8791ab9-ee3b-4af7-98d5-2bc06f5d863a-kube-api-access-cdkhd\") pod \"nmstate-operator-6769fb99d-bnf7g\" (UID: \"d8791ab9-ee3b-4af7-98d5-2bc06f5d863a\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.448762 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdkhd\" (UniqueName: \"kubernetes.io/projected/d8791ab9-ee3b-4af7-98d5-2bc06f5d863a-kube-api-access-cdkhd\") pod \"nmstate-operator-6769fb99d-bnf7g\" (UID: \"d8791ab9-ee3b-4af7-98d5-2bc06f5d863a\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.536843 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" Jan 10 16:37:41 crc kubenswrapper[5036]: I0110 16:37:41.733432 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-bnf7g"] Jan 10 16:37:41 crc kubenswrapper[5036]: W0110 16:37:41.744374 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8791ab9_ee3b_4af7_98d5_2bc06f5d863a.slice/crio-f29291d5071579f9318d2e8348f93972b6e13aaa92f7bfb846a0cba8c54aee0e WatchSource:0}: Error finding container f29291d5071579f9318d2e8348f93972b6e13aaa92f7bfb846a0cba8c54aee0e: Status 404 returned error can't find the container with id f29291d5071579f9318d2e8348f93972b6e13aaa92f7bfb846a0cba8c54aee0e Jan 10 16:37:42 crc kubenswrapper[5036]: I0110 16:37:42.398587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" event={"ID":"d8791ab9-ee3b-4af7-98d5-2bc06f5d863a","Type":"ContainerStarted","Data":"f29291d5071579f9318d2e8348f93972b6e13aaa92f7bfb846a0cba8c54aee0e"} Jan 10 16:37:44 crc kubenswrapper[5036]: I0110 16:37:44.416811 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" event={"ID":"d8791ab9-ee3b-4af7-98d5-2bc06f5d863a","Type":"ContainerStarted","Data":"5fbd2a58c75fb0d59ca643fc57ba0e1a5ad3d99c36f16bf7ed0da54b25d3c05a"} Jan 10 16:37:44 crc kubenswrapper[5036]: I0110 16:37:44.437218 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-bnf7g" podStartSLOduration=1.2905258179999999 podStartE2EDuration="3.43719499s" podCreationTimestamp="2026-01-10 16:37:41 +0000 UTC" firstStartedPulling="2026-01-10 16:37:41.750049263 +0000 UTC m=+583.620284757" lastFinishedPulling="2026-01-10 16:37:43.896718435 +0000 UTC m=+585.766953929" observedRunningTime="2026-01-10 16:37:44.434990674 +0000 UTC m=+586.305226168" watchObservedRunningTime="2026-01-10 16:37:44.43719499 +0000 UTC m=+586.307430514" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.370243 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-f826c"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.371336 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.372920 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mrxlj" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.382640 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.383329 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.387220 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.399994 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-f826c"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.451061 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.457696 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7r9qs"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.458955 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.491137 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9tmg\" (UniqueName: \"kubernetes.io/projected/d921a9df-835d-4165-ac39-8717cfcf384d-kube-api-access-d9tmg\") pod \"nmstate-metrics-7f7f7578db-f826c\" (UID: \"d921a9df-835d-4165-ac39-8717cfcf384d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.491183 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.491249 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bc46\" (UniqueName: \"kubernetes.io/projected/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-kube-api-access-8bc46\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.514245 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.515114 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.518028 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.518239 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.518385 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8z6ph" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.526949 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593426 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-dbus-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593499 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-nmstate-lock\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593554 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/31225a12-4366-4224-9bb5-3c8ee635a631-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593602 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bc46\" (UniqueName: \"kubernetes.io/projected/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-kube-api-access-8bc46\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593643 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-ovs-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593671 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9tmg\" (UniqueName: \"kubernetes.io/projected/d921a9df-835d-4165-ac39-8717cfcf384d-kube-api-access-d9tmg\") pod \"nmstate-metrics-7f7f7578db-f826c\" (UID: \"d921a9df-835d-4165-ac39-8717cfcf384d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qct9q\" (UniqueName: \"kubernetes.io/projected/5abcf259-63b1-44b5-b335-950b101edec4-kube-api-access-qct9q\") pod 
\"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593751 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593779 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/31225a12-4366-4224-9bb5-3c8ee635a631-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.593817 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnn7j\" (UniqueName: \"kubernetes.io/projected/31225a12-4366-4224-9bb5-3c8ee635a631-kube-api-access-pnn7j\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.617451 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.620984 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bc46\" (UniqueName: \"kubernetes.io/projected/e0f63dbf-f65f-4d9a-8cf4-802a41ed012b-kube-api-access-8bc46\") pod \"nmstate-webhook-f8fb84555-cdrlk\" (UID: \"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.622503 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9tmg\" (UniqueName: \"kubernetes.io/projected/d921a9df-835d-4165-ac39-8717cfcf384d-kube-api-access-d9tmg\") pod \"nmstate-metrics-7f7f7578db-f826c\" (UID: \"d921a9df-835d-4165-ac39-8717cfcf384d\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.686822 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694479 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/31225a12-4366-4224-9bb5-3c8ee635a631-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694524 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-ovs-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694547 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qct9q\" (UniqueName: \"kubernetes.io/projected/5abcf259-63b1-44b5-b335-950b101edec4-kube-api-access-qct9q\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694569 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/31225a12-4366-4224-9bb5-3c8ee635a631-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694591 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnn7j\" (UniqueName: \"kubernetes.io/projected/31225a12-4366-4224-9bb5-3c8ee635a631-kube-api-access-pnn7j\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694628 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-dbus-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694644 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-nmstate-lock\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.694765 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-nmstate-lock\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.695519 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/31225a12-4366-4224-9bb5-3c8ee635a631-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: 
\"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.695935 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-ovs-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.695991 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5abcf259-63b1-44b5-b335-950b101edec4-dbus-socket\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.699242 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/31225a12-4366-4224-9bb5-3c8ee635a631-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.708120 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d5d894b64-hjm7m"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.708180 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.710186 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.727457 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-hjm7m"] Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.747333 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qct9q\" (UniqueName: \"kubernetes.io/projected/5abcf259-63b1-44b5-b335-950b101edec4-kube-api-access-qct9q\") pod \"nmstate-handler-7r9qs\" (UID: \"5abcf259-63b1-44b5-b335-950b101edec4\") " pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.768803 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnn7j\" (UniqueName: \"kubernetes.io/projected/31225a12-4366-4224-9bb5-3c8ee635a631-kube-api-access-pnn7j\") pod \"nmstate-console-plugin-6ff7998486-577fv\" (UID: \"31225a12-4366-4224-9bb5-3c8ee635a631\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.779434 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796331 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-config\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796758 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4khn\" (UniqueName: \"kubernetes.io/projected/3192a389-aa58-4cd9-8027-9f2b90b1c230-kube-api-access-d4khn\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796785 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-oauth-config\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796810 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796859 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-trusted-ca-bundle\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796904 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-service-ca\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.796927 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-oauth-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.832728 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899129 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-config\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899173 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4khn\" (UniqueName: \"kubernetes.io/projected/3192a389-aa58-4cd9-8027-9f2b90b1c230-kube-api-access-d4khn\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899210 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-oauth-config\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899237 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899288 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-trusted-ca-bundle\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899334 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-service-ca\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.899355 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-oauth-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.900529 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-oauth-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.900669 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-config\") pod \"console-5d5d894b64-hjm7m\" (UID: 
\"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.901326 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-trusted-ca-bundle\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.901620 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3192a389-aa58-4cd9-8027-9f2b90b1c230-service-ca\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.906412 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-serving-cert\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.912859 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3192a389-aa58-4cd9-8027-9f2b90b1c230-console-oauth-config\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:52 crc kubenswrapper[5036]: I0110 16:37:52.915940 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4khn\" (UniqueName: \"kubernetes.io/projected/3192a389-aa58-4cd9-8027-9f2b90b1c230-kube-api-access-d4khn\") pod \"console-5d5d894b64-hjm7m\" (UID: \"3192a389-aa58-4cd9-8027-9f2b90b1c230\") " pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.006867 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv"] Jan 10 16:37:53 crc kubenswrapper[5036]: W0110 16:37:53.013658 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31225a12_4366_4224_9bb5_3c8ee635a631.slice/crio-f884c6f9cbf473a0b41bca82a2bb2633f9ceea29ee8265c3fe38c7a59a8eb8db WatchSource:0}: Error finding container f884c6f9cbf473a0b41bca82a2bb2633f9ceea29ee8265c3fe38c7a59a8eb8db: Status 404 returned error can't find the container with id f884c6f9cbf473a0b41bca82a2bb2633f9ceea29ee8265c3fe38c7a59a8eb8db Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.069455 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.185276 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk"] Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.192053 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-f826c"] Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.266083 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-hjm7m"] Jan 10 16:37:53 crc kubenswrapper[5036]: W0110 16:37:53.274774 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3192a389_aa58_4cd9_8027_9f2b90b1c230.slice/crio-2398f57eccdfdb174126bfcb27fc00be387db84c4ba986676e02a98149be5431 WatchSource:0}: Error finding container 2398f57eccdfdb174126bfcb27fc00be387db84c4ba986676e02a98149be5431: Status 404 returned error can't find the container with id 2398f57eccdfdb174126bfcb27fc00be387db84c4ba986676e02a98149be5431 Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.479543 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-hjm7m" event={"ID":"3192a389-aa58-4cd9-8027-9f2b90b1c230","Type":"ContainerStarted","Data":"9d7db80cae1aa401a42e54473a31b8b3e78d35e9cb6c91aba5575dfa3d092f99"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.479585 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-hjm7m" event={"ID":"3192a389-aa58-4cd9-8027-9f2b90b1c230","Type":"ContainerStarted","Data":"2398f57eccdfdb174126bfcb27fc00be387db84c4ba986676e02a98149be5431"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.484831 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7r9qs" event={"ID":"5abcf259-63b1-44b5-b335-950b101edec4","Type":"ContainerStarted","Data":"1da9ab5e0a22fc29fa4e8ff414db6fc0815bb6c2a051e6baf96cd7b476f5f80d"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.493469 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" event={"ID":"d921a9df-835d-4165-ac39-8717cfcf384d","Type":"ContainerStarted","Data":"16cfaf9af6deafc6613f5c74b04efd1551fc595d77f395f26e63e2b30ac35f57"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.494878 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" event={"ID":"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b","Type":"ContainerStarted","Data":"ea75886056e8a5f6c8b24bb2c3dd9e3d228ce289e5a26ab4e88e535b776653dc"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.496445 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" event={"ID":"31225a12-4366-4224-9bb5-3c8ee635a631","Type":"ContainerStarted","Data":"f884c6f9cbf473a0b41bca82a2bb2633f9ceea29ee8265c3fe38c7a59a8eb8db"} Jan 10 16:37:53 crc kubenswrapper[5036]: I0110 16:37:53.503977 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d5d894b64-hjm7m" podStartSLOduration=1.503949717 podStartE2EDuration="1.503949717s" podCreationTimestamp="2026-01-10 16:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:37:53.501027887 +0000 UTC 
m=+595.371263441" watchObservedRunningTime="2026-01-10 16:37:53.503949717 +0000 UTC m=+595.374185251" Jan 10 16:37:55 crc kubenswrapper[5036]: I0110 16:37:55.903868 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:37:55 crc kubenswrapper[5036]: I0110 16:37:55.904352 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.521631 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7r9qs" event={"ID":"5abcf259-63b1-44b5-b335-950b101edec4","Type":"ContainerStarted","Data":"c3746d6a69dad7b303e7a2974ea52ade19867545420e03ae465effed26ac2e32"} Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.523840 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" event={"ID":"d921a9df-835d-4165-ac39-8717cfcf384d","Type":"ContainerStarted","Data":"d27c562cfc0b5cbb3f6cdc943d5425fa9b6841029617a535619544b7e8c60209"} Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.526027 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" event={"ID":"e0f63dbf-f65f-4d9a-8cf4-802a41ed012b","Type":"ContainerStarted","Data":"9b36491fbed1dd0339139261371495ee96d104a4b821b55eed31f387323a740e"} Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.526133 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.528113 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" event={"ID":"31225a12-4366-4224-9bb5-3c8ee635a631","Type":"ContainerStarted","Data":"b132bfcf24042585f9b429879d001f1bab4804cd0c1b5e247849a8898b68cc06"} Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.546051 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7r9qs" podStartSLOduration=1.5741649789999999 podStartE2EDuration="4.546018998s" podCreationTimestamp="2026-01-10 16:37:52 +0000 UTC" firstStartedPulling="2026-01-10 16:37:52.826597232 +0000 UTC m=+594.696832726" lastFinishedPulling="2026-01-10 16:37:55.798451251 +0000 UTC m=+597.668686745" observedRunningTime="2026-01-10 16:37:56.539973631 +0000 UTC m=+598.410209145" watchObservedRunningTime="2026-01-10 16:37:56.546018998 +0000 UTC m=+598.416254512" Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.560901 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-577fv" podStartSLOduration=1.795095728 podStartE2EDuration="4.560860969s" podCreationTimestamp="2026-01-10 16:37:52 +0000 UTC" firstStartedPulling="2026-01-10 16:37:53.016515773 +0000 UTC m=+594.886751257" lastFinishedPulling="2026-01-10 16:37:55.782281004 +0000 UTC m=+597.652516498" observedRunningTime="2026-01-10 16:37:56.55800609 +0000 UTC m=+598.428241584" 
watchObservedRunningTime="2026-01-10 16:37:56.560860969 +0000 UTC m=+598.431096463" Jan 10 16:37:56 crc kubenswrapper[5036]: I0110 16:37:56.577160 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" podStartSLOduration=1.932325482 podStartE2EDuration="4.577142469s" podCreationTimestamp="2026-01-10 16:37:52 +0000 UTC" firstStartedPulling="2026-01-10 16:37:53.182100911 +0000 UTC m=+595.052336405" lastFinishedPulling="2026-01-10 16:37:55.826917878 +0000 UTC m=+597.697153392" observedRunningTime="2026-01-10 16:37:56.575336019 +0000 UTC m=+598.445571523" watchObservedRunningTime="2026-01-10 16:37:56.577142469 +0000 UTC m=+598.447377963" Jan 10 16:37:57 crc kubenswrapper[5036]: I0110 16:37:57.538012 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:37:58 crc kubenswrapper[5036]: I0110 16:37:58.543554 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" event={"ID":"d921a9df-835d-4165-ac39-8717cfcf384d","Type":"ContainerStarted","Data":"f4202522cc37c8efb163dbc306311b815c094779f202fd282b03602bbe2c444b"} Jan 10 16:37:58 crc kubenswrapper[5036]: I0110 16:37:58.564119 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-f826c" podStartSLOduration=1.444696331 podStartE2EDuration="6.564089529s" podCreationTimestamp="2026-01-10 16:37:52 +0000 UTC" firstStartedPulling="2026-01-10 16:37:53.188341423 +0000 UTC m=+595.058576917" lastFinishedPulling="2026-01-10 16:37:58.307734591 +0000 UTC m=+600.177970115" observedRunningTime="2026-01-10 16:37:58.559555353 +0000 UTC m=+600.429790887" watchObservedRunningTime="2026-01-10 16:37:58.564089529 +0000 UTC m=+600.434325073" Jan 10 16:38:02 crc kubenswrapper[5036]: I0110 16:38:02.824406 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7r9qs" Jan 10 16:38:03 crc kubenswrapper[5036]: I0110 16:38:03.070409 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:38:03 crc kubenswrapper[5036]: I0110 16:38:03.070646 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:38:03 crc kubenswrapper[5036]: I0110 16:38:03.078642 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:38:03 crc kubenswrapper[5036]: I0110 16:38:03.593315 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d5d894b64-hjm7m" Jan 10 16:38:03 crc kubenswrapper[5036]: I0110 16:38:03.667932 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:38:12 crc kubenswrapper[5036]: I0110 16:38:12.716258 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-cdrlk" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.904305 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 
16:38:25.904873 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.904926 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.905591 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.905656 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2" gracePeriod=600 Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.961101 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms"] Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.962578 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.964311 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 10 16:38:25 crc kubenswrapper[5036]: I0110 16:38:25.968241 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms"] Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.022544 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.022607 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.022668 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: 
\"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.123392 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.123846 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.124372 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.124516 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.124865 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.156775 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.277027 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.699653 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms"] Jan 10 16:38:26 crc kubenswrapper[5036]: W0110 16:38:26.707557 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b82ef1_8690_4ba9_9ebe_1ce6b933df2b.slice/crio-a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6 WatchSource:0}: Error finding container a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6: Status 404 returned error can't find the container with id a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6 Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.750572 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2" exitCode=0 Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.750652 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2"} Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.750991 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78"} Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.751012 5036 scope.go:117] "RemoveContainer" containerID="af7debfeb8d3a1dfa2638975b895daa7ecdb2dc663d2c78b9975fbe6f240f10a" Jan 10 16:38:26 crc kubenswrapper[5036]: I0110 16:38:26.753740 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" event={"ID":"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b","Type":"ContainerStarted","Data":"a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6"} Jan 10 16:38:27 crc kubenswrapper[5036]: I0110 16:38:27.764502 5036 generic.go:334] "Generic (PLEG): container finished" podID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerID="19fc80bd560a71e0fbf5e067abfaa11cb9677bd7f84017d54da7a60ca090b9e7" exitCode=0 Jan 10 16:38:27 crc kubenswrapper[5036]: I0110 16:38:27.764768 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" event={"ID":"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b","Type":"ContainerDied","Data":"19fc80bd560a71e0fbf5e067abfaa11cb9677bd7f84017d54da7a60ca090b9e7"} Jan 10 16:38:28 crc kubenswrapper[5036]: I0110 16:38:28.727988 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bvg6n" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" containerID="cri-o://b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9" gracePeriod=15 Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.093620 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bvg6n_d1559e8b-1a4d-4929-80cc-235f23048dd6/console/0.log" Jan 10 16:38:29 crc 
kubenswrapper[5036]: I0110 16:38:29.093946 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163386 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgqkh\" (UniqueName: \"kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163446 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163492 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163519 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163558 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163596 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.163620 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert\") pod \"d1559e8b-1a4d-4929-80cc-235f23048dd6\" (UID: \"d1559e8b-1a4d-4929-80cc-235f23048dd6\") " Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.164946 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config" (OuterVolumeSpecName: "console-config") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.164972 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.164956 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca" (OuterVolumeSpecName: "service-ca") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.166036 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.171291 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.171408 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh" (OuterVolumeSpecName: "kube-api-access-kgqkh") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "kube-api-access-kgqkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.172462 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d1559e8b-1a4d-4929-80cc-235f23048dd6" (UID: "d1559e8b-1a4d-4929-80cc-235f23048dd6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266115 5036 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266191 5036 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266208 5036 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266221 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgqkh\" (UniqueName: \"kubernetes.io/projected/d1559e8b-1a4d-4929-80cc-235f23048dd6-kube-api-access-kgqkh\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266237 5036 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1559e8b-1a4d-4929-80cc-235f23048dd6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266251 5036 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-service-ca\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.266261 5036 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1559e8b-1a4d-4929-80cc-235f23048dd6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.781151 5036 generic.go:334] "Generic (PLEG): container finished" podID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerID="867e825f37515de7c43fb4059e1637aa095732a7ba6286d5b414a0509bc0c997" exitCode=0 Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.781225 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" event={"ID":"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b","Type":"ContainerDied","Data":"867e825f37515de7c43fb4059e1637aa095732a7ba6286d5b414a0509bc0c997"} Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784550 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bvg6n_d1559e8b-1a4d-4929-80cc-235f23048dd6/console/0.log" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784607 5036 generic.go:334] "Generic (PLEG): container finished" podID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerID="b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9" exitCode=2 Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784643 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bvg6n" event={"ID":"d1559e8b-1a4d-4929-80cc-235f23048dd6","Type":"ContainerDied","Data":"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9"} Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784713 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bvg6n" 
event={"ID":"d1559e8b-1a4d-4929-80cc-235f23048dd6","Type":"ContainerDied","Data":"8e59b598f5cad56eec6dda20ae3a2de67421836d77c3daf83c09affa4e787dbe"} Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784734 5036 scope.go:117] "RemoveContainer" containerID="b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.784743 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bvg6n" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.830794 5036 scope.go:117] "RemoveContainer" containerID="b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9" Jan 10 16:38:29 crc kubenswrapper[5036]: E0110 16:38:29.831603 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9\": container with ID starting with b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9 not found: ID does not exist" containerID="b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.831721 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9"} err="failed to get container status \"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9\": rpc error: code = NotFound desc = could not find container \"b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9\": container with ID starting with b7327ce6e4431da8f55172479aa4b06c3729180dd85ce0cd8223bc14a304c2f9 not found: ID does not exist" Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.849426 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:38:29 crc kubenswrapper[5036]: I0110 16:38:29.853490 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bvg6n"] Jan 10 16:38:30 crc kubenswrapper[5036]: I0110 16:38:30.515235 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" path="/var/lib/kubelet/pods/d1559e8b-1a4d-4929-80cc-235f23048dd6/volumes" Jan 10 16:38:31 crc kubenswrapper[5036]: I0110 16:38:31.807182 5036 generic.go:334] "Generic (PLEG): container finished" podID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerID="b4f85ce5eaf38c116b12f31a05532360cd41771caa8806cacfad74725ce48ee8" exitCode=0 Jan 10 16:38:31 crc kubenswrapper[5036]: I0110 16:38:31.807245 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" event={"ID":"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b","Type":"ContainerDied","Data":"b4f85ce5eaf38c116b12f31a05532360cd41771caa8806cacfad74725ce48ee8"} Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.061859 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.117567 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle\") pod \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.117672 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util\") pod \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.118717 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf\") pod \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\" (UID: \"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b\") " Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.119033 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle" (OuterVolumeSpecName: "bundle") pod "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" (UID: "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.124022 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf" (OuterVolumeSpecName: "kube-api-access-4gkvf") pod "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" (UID: "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b"). InnerVolumeSpecName "kube-api-access-4gkvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.130187 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util" (OuterVolumeSpecName: "util") pod "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" (UID: "56b82ef1-8690-4ba9-9ebe-1ce6b933df2b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.220142 5036 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.220182 5036 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-util\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.220196 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gkvf\" (UniqueName: \"kubernetes.io/projected/56b82ef1-8690-4ba9-9ebe-1ce6b933df2b-kube-api-access-4gkvf\") on node \"crc\" DevicePath \"\"" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.824194 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" event={"ID":"56b82ef1-8690-4ba9-9ebe-1ce6b933df2b","Type":"ContainerDied","Data":"a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6"} Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.824258 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms" Jan 10 16:38:33 crc kubenswrapper[5036]: I0110 16:38:33.824262 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a66b85b894e97dab4b7738ab59d3b7ba08b3e7cea964af23e9082bf9496a5de6" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.265555 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl"] Jan 10 16:38:45 crc kubenswrapper[5036]: E0110 16:38:45.266461 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266486 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" Jan 10 16:38:45 crc kubenswrapper[5036]: E0110 16:38:45.266496 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="util" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266502 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="util" Jan 10 16:38:45 crc kubenswrapper[5036]: E0110 16:38:45.266511 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="extract" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266518 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="extract" Jan 10 16:38:45 crc kubenswrapper[5036]: E0110 16:38:45.266542 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="pull" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266548 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="pull" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266671 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b82ef1-8690-4ba9-9ebe-1ce6b933df2b" containerName="extract" Jan 
10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.266742 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1559e8b-1a4d-4929-80cc-235f23048dd6" containerName="console" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.267203 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.268977 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.269160 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2wsms" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.269454 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.269574 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.276279 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.284722 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl"] Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.376595 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrgpf\" (UniqueName: \"kubernetes.io/projected/27d72d19-58d6-4094-8d3a-826354e6bb02-kube-api-access-zrgpf\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.376690 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-webhook-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.376730 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-apiservice-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.477953 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-apiservice-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.478041 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrgpf\" (UniqueName: 
\"kubernetes.io/projected/27d72d19-58d6-4094-8d3a-826354e6bb02-kube-api-access-zrgpf\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.478108 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-webhook-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.484058 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-apiservice-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.485221 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27d72d19-58d6-4094-8d3a-826354e6bb02-webhook-cert\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.514701 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrgpf\" (UniqueName: \"kubernetes.io/projected/27d72d19-58d6-4094-8d3a-826354e6bb02-kube-api-access-zrgpf\") pod \"metallb-operator-controller-manager-57fdf6dfbb-rvjhl\" (UID: \"27d72d19-58d6-4094-8d3a-826354e6bb02\") " pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.626284 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt"] Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.626532 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.627155 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.629076 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.629086 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.629319 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-p827g" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.680491 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsgb\" (UniqueName: \"kubernetes.io/projected/12be6fd4-c97c-439e-8a06-3769f37d7b48-kube-api-access-vqsgb\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.680846 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-webhook-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.680876 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-apiservice-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.689862 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt"] Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.791315 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-webhook-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.791363 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-apiservice-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.791398 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsgb\" (UniqueName: \"kubernetes.io/projected/12be6fd4-c97c-439e-8a06-3769f37d7b48-kube-api-access-vqsgb\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 
16:38:45.800924 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-apiservice-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.814596 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/12be6fd4-c97c-439e-8a06-3769f37d7b48-webhook-cert\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.837465 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsgb\" (UniqueName: \"kubernetes.io/projected/12be6fd4-c97c-439e-8a06-3769f37d7b48-kube-api-access-vqsgb\") pod \"metallb-operator-webhook-server-86bf866985-6ggxt\" (UID: \"12be6fd4-c97c-439e-8a06-3769f37d7b48\") " pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:45 crc kubenswrapper[5036]: I0110 16:38:45.981952 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:46 crc kubenswrapper[5036]: I0110 16:38:46.107126 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl"] Jan 10 16:38:46 crc kubenswrapper[5036]: W0110 16:38:46.116613 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d72d19_58d6_4094_8d3a_826354e6bb02.slice/crio-443af3d01f73421cbb0264857ba6cb1a73b30d11432793a9fde9e80ef589d1bf WatchSource:0}: Error finding container 443af3d01f73421cbb0264857ba6cb1a73b30d11432793a9fde9e80ef589d1bf: Status 404 returned error can't find the container with id 443af3d01f73421cbb0264857ba6cb1a73b30d11432793a9fde9e80ef589d1bf Jan 10 16:38:46 crc kubenswrapper[5036]: I0110 16:38:46.165388 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt"] Jan 10 16:38:46 crc kubenswrapper[5036]: W0110 16:38:46.173087 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12be6fd4_c97c_439e_8a06_3769f37d7b48.slice/crio-9cdbe9e48e9578f14db38da53f413bf85613b5d2855c77ec0cc41d0a29069826 WatchSource:0}: Error finding container 9cdbe9e48e9578f14db38da53f413bf85613b5d2855c77ec0cc41d0a29069826: Status 404 returned error can't find the container with id 9cdbe9e48e9578f14db38da53f413bf85613b5d2855c77ec0cc41d0a29069826 Jan 10 16:38:46 crc kubenswrapper[5036]: I0110 16:38:46.900619 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" event={"ID":"27d72d19-58d6-4094-8d3a-826354e6bb02","Type":"ContainerStarted","Data":"443af3d01f73421cbb0264857ba6cb1a73b30d11432793a9fde9e80ef589d1bf"} Jan 10 16:38:46 crc kubenswrapper[5036]: I0110 16:38:46.901960 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" 
event={"ID":"12be6fd4-c97c-439e-8a06-3769f37d7b48","Type":"ContainerStarted","Data":"9cdbe9e48e9578f14db38da53f413bf85613b5d2855c77ec0cc41d0a29069826"} Jan 10 16:38:48 crc kubenswrapper[5036]: I0110 16:38:48.088620 5036 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.929545 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" event={"ID":"27d72d19-58d6-4094-8d3a-826354e6bb02","Type":"ContainerStarted","Data":"429aeaa490bbc02c7340b49389c511519040a84a53e3895030600d81ec492c06"} Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.930100 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.931025 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" event={"ID":"12be6fd4-c97c-439e-8a06-3769f37d7b48","Type":"ContainerStarted","Data":"abf116242d06779f1432e7653a2462e9a2e213a04754da9d51dd0552d9af3f17"} Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.931155 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.956582 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" podStartSLOduration=2.078919184 podStartE2EDuration="6.956565674s" podCreationTimestamp="2026-01-10 16:38:45 +0000 UTC" firstStartedPulling="2026-01-10 16:38:46.120104521 +0000 UTC m=+647.990340015" lastFinishedPulling="2026-01-10 16:38:50.997751011 +0000 UTC m=+652.867986505" observedRunningTime="2026-01-10 16:38:51.952259422 +0000 UTC m=+653.822494916" watchObservedRunningTime="2026-01-10 16:38:51.956565674 +0000 UTC m=+653.826801168" Jan 10 16:38:51 crc kubenswrapper[5036]: I0110 16:38:51.974777 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" podStartSLOduration=2.138989046 podStartE2EDuration="6.974757947s" podCreationTimestamp="2026-01-10 16:38:45 +0000 UTC" firstStartedPulling="2026-01-10 16:38:46.175039039 +0000 UTC m=+648.045274533" lastFinishedPulling="2026-01-10 16:38:51.01080794 +0000 UTC m=+652.881043434" observedRunningTime="2026-01-10 16:38:51.970130937 +0000 UTC m=+653.840366451" watchObservedRunningTime="2026-01-10 16:38:51.974757947 +0000 UTC m=+653.844993441" Jan 10 16:39:05 crc kubenswrapper[5036]: I0110 16:39:05.988776 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86bf866985-6ggxt" Jan 10 16:39:25 crc kubenswrapper[5036]: I0110 16:39:25.630991 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57fdf6dfbb-rvjhl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.319639 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vvnr7"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.324317 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.324562 5036 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.325980 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.329632 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.329718 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.329923 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.329888 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-hgfjx" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.364744 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.436637 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bxfjm"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.437984 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.446267 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.448183 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.449648 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.455117 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hvhvx" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.488297 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-5xsxl"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.489556 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.491736 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.501309 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-5xsxl"] Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518348 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518445 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-metrics\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518547 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-sockets\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518588 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518637 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmldz\" (UniqueName: \"kubernetes.io/projected/24cab86b-603a-48b4-9b8f-add5e9a79f7b-kube-api-access-mmldz\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518664 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-reloader\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518697 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/afae39b8-393f-46d1-a436-512d9ba68c25-frr-startup\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.518790 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6jr\" (UniqueName: \"kubernetes.io/projected/afae39b8-393f-46d1-a436-512d9ba68c25-kube-api-access-zf6jr\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 
16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.519359 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-conf\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.620894 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metrics-certs\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621230 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-metrics\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621324 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-sockets\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621410 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621481 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpv8\" (UniqueName: \"kubernetes.io/projected/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-kube-api-access-9tpv8\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621597 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-metrics-certs\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.621669 5036 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.621760 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs podName:afae39b8-393f-46d1-a436-512d9ba68c25 nodeName:}" failed. No retries permitted until 2026-01-10 16:39:27.121742237 +0000 UTC m=+688.991977731 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs") pod "frr-k8s-vvnr7" (UID: "afae39b8-393f-46d1-a436-512d9ba68c25") : secret "frr-k8s-certs-secret" not found Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621835 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-cert\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621906 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmldz\" (UniqueName: \"kubernetes.io/projected/24cab86b-603a-48b4-9b8f-add5e9a79f7b-kube-api-access-mmldz\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621941 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-reloader\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621961 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/afae39b8-393f-46d1-a436-512d9ba68c25-frr-startup\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.621998 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622051 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metallb-excludel2\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622088 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6jr\" (UniqueName: \"kubernetes.io/projected/afae39b8-393f-46d1-a436-512d9ba68c25-kube-api-access-zf6jr\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622091 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-metrics\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622124 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-conf\") pod \"frr-k8s-vvnr7\" (UID: 
\"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622278 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622328 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bddj\" (UniqueName: \"kubernetes.io/projected/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-kube-api-access-5bddj\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622445 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-conf\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.622554 5036 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.622605 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert podName:24cab86b-603a-48b4-9b8f-add5e9a79f7b nodeName:}" failed. No retries permitted until 2026-01-10 16:39:27.122589051 +0000 UTC m=+688.992824545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert") pod "frr-k8s-webhook-server-7784b6fcf-qzgnv" (UID: "24cab86b-603a-48b4-9b8f-add5e9a79f7b") : secret "frr-k8s-webhook-server-cert" not found Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.622818 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-reloader\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.623010 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/afae39b8-393f-46d1-a436-512d9ba68c25-frr-startup\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.623066 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/afae39b8-393f-46d1-a436-512d9ba68c25-frr-sockets\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.640276 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6jr\" (UniqueName: \"kubernetes.io/projected/afae39b8-393f-46d1-a436-512d9ba68c25-kube-api-access-zf6jr\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.640552 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmldz\" (UniqueName: \"kubernetes.io/projected/24cab86b-603a-48b4-9b8f-add5e9a79f7b-kube-api-access-mmldz\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723221 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metallb-excludel2\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723295 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bddj\" (UniqueName: \"kubernetes.io/projected/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-kube-api-access-5bddj\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723324 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metrics-certs\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723368 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpv8\" (UniqueName: \"kubernetes.io/projected/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-kube-api-access-9tpv8\") pod \"speaker-bxfjm\" (UID: 
\"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723390 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-metrics-certs\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723408 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-cert\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.723438 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.723551 5036 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 10 16:39:26 crc kubenswrapper[5036]: E0110 16:39:26.723604 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist podName:6d60af08-1ea0-49e4-aa55-8f9bfa63b34b nodeName:}" failed. No retries permitted until 2026-01-10 16:39:27.223588274 +0000 UTC m=+689.093823768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist") pod "speaker-bxfjm" (UID: "6d60af08-1ea0-49e4-aa55-8f9bfa63b34b") : secret "metallb-memberlist" not found Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.724146 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metallb-excludel2\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.725360 5036 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.726960 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-metrics-certs\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.728440 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-metrics-certs\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.737847 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-cert\") pod \"controller-5bddd4b946-5xsxl\" (UID: 
\"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.738372 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpv8\" (UniqueName: \"kubernetes.io/projected/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-kube-api-access-9tpv8\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.739026 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bddj\" (UniqueName: \"kubernetes.io/projected/fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69-kube-api-access-5bddj\") pod \"controller-5bddd4b946-5xsxl\" (UID: \"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69\") " pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:26 crc kubenswrapper[5036]: I0110 16:39:26.804979 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.008732 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-5xsxl"] Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.128040 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.128143 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.133962 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afae39b8-393f-46d1-a436-512d9ba68c25-metrics-certs\") pod \"frr-k8s-vvnr7\" (UID: \"afae39b8-393f-46d1-a436-512d9ba68c25\") " pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.134619 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24cab86b-603a-48b4-9b8f-add5e9a79f7b-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-qzgnv\" (UID: \"24cab86b-603a-48b4-9b8f-add5e9a79f7b\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.184366 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5xsxl" event={"ID":"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69","Type":"ContainerStarted","Data":"8923e31483257db455a3bba0a2975cbecfaa50a309b8c400389be17b1715bf05"} Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.184426 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5xsxl" event={"ID":"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69","Type":"ContainerStarted","Data":"2b2605df0b2df24e23e56771387947830aed6e9d93c08a4bc9e98bb3823075b8"} Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.230079 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:27 crc kubenswrapper[5036]: E0110 16:39:27.230383 5036 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 10 16:39:27 crc kubenswrapper[5036]: E0110 16:39:27.230660 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist podName:6d60af08-1ea0-49e4-aa55-8f9bfa63b34b nodeName:}" failed. No retries permitted until 2026-01-10 16:39:28.230607985 +0000 UTC m=+690.100843479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist") pod "speaker-bxfjm" (UID: "6d60af08-1ea0-49e4-aa55-8f9bfa63b34b") : secret "metallb-memberlist" not found Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.256511 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.262378 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:27 crc kubenswrapper[5036]: I0110 16:39:27.725024 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv"] Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.190510 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"3d2fb4abd7875d4edb22e86909a28d7d920afbb16891e02c9f0d502ceb5e1225"} Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.192655 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-5xsxl" event={"ID":"fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69","Type":"ContainerStarted","Data":"56982e3673d79f5fb5bbb7335f24216e2036c57afdb135b506584429e339f20b"} Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.192877 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.193799 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" event={"ID":"24cab86b-603a-48b4-9b8f-add5e9a79f7b","Type":"ContainerStarted","Data":"5a9c6c5efc7befe67c1d673e30fe7cc7e81229c410db4ce86918530dcb6cd569"} Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.219995 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-5xsxl" podStartSLOduration=2.219962759 podStartE2EDuration="2.219962759s" podCreationTimestamp="2026-01-10 16:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:39:28.213147417 +0000 UTC m=+690.083382921" watchObservedRunningTime="2026-01-10 16:39:28.219962759 +0000 UTC m=+690.090198283" Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.241966 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " 
pod="metallb-system/speaker-bxfjm" Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.248829 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d60af08-1ea0-49e4-aa55-8f9bfa63b34b-memberlist\") pod \"speaker-bxfjm\" (UID: \"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b\") " pod="metallb-system/speaker-bxfjm" Jan 10 16:39:28 crc kubenswrapper[5036]: I0110 16:39:28.254734 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bxfjm" Jan 10 16:39:28 crc kubenswrapper[5036]: W0110 16:39:28.280493 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d60af08_1ea0_49e4_aa55_8f9bfa63b34b.slice/crio-59a40e49764d9968e2e4a94147de5fe5de2216a9333c182f5f8f860ffa832f56 WatchSource:0}: Error finding container 59a40e49764d9968e2e4a94147de5fe5de2216a9333c182f5f8f860ffa832f56: Status 404 returned error can't find the container with id 59a40e49764d9968e2e4a94147de5fe5de2216a9333c182f5f8f860ffa832f56 Jan 10 16:39:29 crc kubenswrapper[5036]: I0110 16:39:29.200907 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxfjm" event={"ID":"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b","Type":"ContainerStarted","Data":"faaaea9b4939fd5a9e4bc517b9ff47a0e03609944bf99c9b190ed2d20279e169"} Jan 10 16:39:29 crc kubenswrapper[5036]: I0110 16:39:29.201266 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxfjm" event={"ID":"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b","Type":"ContainerStarted","Data":"8f3d63e22b160134c4224660cb498c57ba7129501a91064fec80461346effd80"} Jan 10 16:39:29 crc kubenswrapper[5036]: I0110 16:39:29.201282 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bxfjm" event={"ID":"6d60af08-1ea0-49e4-aa55-8f9bfa63b34b","Type":"ContainerStarted","Data":"59a40e49764d9968e2e4a94147de5fe5de2216a9333c182f5f8f860ffa832f56"} Jan 10 16:39:29 crc kubenswrapper[5036]: I0110 16:39:29.201535 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bxfjm" Jan 10 16:39:29 crc kubenswrapper[5036]: I0110 16:39:29.219625 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bxfjm" podStartSLOduration=3.219605005 podStartE2EDuration="3.219605005s" podCreationTimestamp="2026-01-10 16:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:39:29.218975427 +0000 UTC m=+691.089210921" watchObservedRunningTime="2026-01-10 16:39:29.219605005 +0000 UTC m=+691.089840499" Jan 10 16:39:35 crc kubenswrapper[5036]: I0110 16:39:35.239340 5036 generic.go:334] "Generic (PLEG): container finished" podID="afae39b8-393f-46d1-a436-512d9ba68c25" containerID="105b5b9e5f815c763eef406f9f2b656639852b2ea892dc3d61e21c4b8e890ab3" exitCode=0 Jan 10 16:39:35 crc kubenswrapper[5036]: I0110 16:39:35.239536 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerDied","Data":"105b5b9e5f815c763eef406f9f2b656639852b2ea892dc3d61e21c4b8e890ab3"} Jan 10 16:39:35 crc kubenswrapper[5036]: I0110 16:39:35.243143 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" 
event={"ID":"24cab86b-603a-48b4-9b8f-add5e9a79f7b","Type":"ContainerStarted","Data":"20626d113833122b119f34759b0a5a95d9504823aa5d854a144e585f5afc8f8f"} Jan 10 16:39:35 crc kubenswrapper[5036]: I0110 16:39:35.243312 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:35 crc kubenswrapper[5036]: I0110 16:39:35.300078 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" podStartSLOduration=2.098692555 podStartE2EDuration="9.300058158s" podCreationTimestamp="2026-01-10 16:39:26 +0000 UTC" firstStartedPulling="2026-01-10 16:39:27.738791869 +0000 UTC m=+689.609027403" lastFinishedPulling="2026-01-10 16:39:34.940157522 +0000 UTC m=+696.810393006" observedRunningTime="2026-01-10 16:39:35.295317724 +0000 UTC m=+697.165553228" watchObservedRunningTime="2026-01-10 16:39:35.300058158 +0000 UTC m=+697.170293672" Jan 10 16:39:36 crc kubenswrapper[5036]: I0110 16:39:36.251297 5036 generic.go:334] "Generic (PLEG): container finished" podID="afae39b8-393f-46d1-a436-512d9ba68c25" containerID="7e71e309cce67d41ab14ecbcf39c7a24b3e08cca09a039e1de1f8106421784fc" exitCode=0 Jan 10 16:39:36 crc kubenswrapper[5036]: I0110 16:39:36.251394 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerDied","Data":"7e71e309cce67d41ab14ecbcf39c7a24b3e08cca09a039e1de1f8106421784fc"} Jan 10 16:39:37 crc kubenswrapper[5036]: I0110 16:39:37.259402 5036 generic.go:334] "Generic (PLEG): container finished" podID="afae39b8-393f-46d1-a436-512d9ba68c25" containerID="4c985c0bc0dc0a2809f4a3b3a9cbfb815b8c73776460ed7371ec35c7dc2f0a90" exitCode=0 Jan 10 16:39:37 crc kubenswrapper[5036]: I0110 16:39:37.259450 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerDied","Data":"4c985c0bc0dc0a2809f4a3b3a9cbfb815b8c73776460ed7371ec35c7dc2f0a90"} Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.258146 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bxfjm" Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.271963 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"469638d5ad5e1af712a804d58f040f021845c8b497bb2cd10a1bd4d7ec5253e2"} Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.272009 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"2d0318cee6bd87b877caadc0822769f447310fb63aa39aedff6a01365726c21f"} Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.272043 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"392206e0dd1b463829310e7f75f5b96ab013da124eb09aea5451fd313902a630"} Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.272055 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"11a9bfeb7de19919df9a75e7e5406bd7e5c1a13e10dcd56c1193039b06b302e5"} Jan 10 16:39:38 crc kubenswrapper[5036]: I0110 16:39:38.272066 
5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"6cffe951fa7cb0d3a6a27c1fe2f26f6e60cfc0279ec248a1ebab69df3b57d5ae"} Jan 10 16:39:39 crc kubenswrapper[5036]: I0110 16:39:39.281497 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vvnr7" event={"ID":"afae39b8-393f-46d1-a436-512d9ba68c25","Type":"ContainerStarted","Data":"baa8180199dd885fd76789595cfaaccab030f5f17e4c118491fa8865c011408f"} Jan 10 16:39:39 crc kubenswrapper[5036]: I0110 16:39:39.281926 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:39 crc kubenswrapper[5036]: I0110 16:39:39.307631 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vvnr7" podStartSLOduration=5.814295522 podStartE2EDuration="13.307617332s" podCreationTimestamp="2026-01-10 16:39:26 +0000 UTC" firstStartedPulling="2026-01-10 16:39:27.420341054 +0000 UTC m=+689.290576548" lastFinishedPulling="2026-01-10 16:39:34.913662864 +0000 UTC m=+696.783898358" observedRunningTime="2026-01-10 16:39:39.303802675 +0000 UTC m=+701.174038179" watchObservedRunningTime="2026-01-10 16:39:39.307617332 +0000 UTC m=+701.177852826" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.272727 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.274179 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.327162 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zkd84" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.327218 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.327454 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.334131 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.431785 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87k5v\" (UniqueName: \"kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v\") pod \"openstack-operator-index-9wf5v\" (UID: \"e1236be2-97ae-4fe2-a09c-6e370099b77d\") " pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.533910 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87k5v\" (UniqueName: \"kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v\") pod \"openstack-operator-index-9wf5v\" (UID: \"e1236be2-97ae-4fe2-a09c-6e370099b77d\") " pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.565817 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87k5v\" (UniqueName: \"kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v\") pod 
\"openstack-operator-index-9wf5v\" (UID: \"e1236be2-97ae-4fe2-a09c-6e370099b77d\") " pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.654627 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:41 crc kubenswrapper[5036]: I0110 16:39:41.843630 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:42 crc kubenswrapper[5036]: I0110 16:39:42.257065 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:42 crc kubenswrapper[5036]: I0110 16:39:42.293258 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:42 crc kubenswrapper[5036]: I0110 16:39:42.341103 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9wf5v" event={"ID":"e1236be2-97ae-4fe2-a09c-6e370099b77d","Type":"ContainerStarted","Data":"7534e7bd63533de34336c5d48788f6965a88e4b943f13b62c8cd8384ce6a9570"} Jan 10 16:39:44 crc kubenswrapper[5036]: I0110 16:39:44.357906 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9wf5v" event={"ID":"e1236be2-97ae-4fe2-a09c-6e370099b77d","Type":"ContainerStarted","Data":"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1"} Jan 10 16:39:44 crc kubenswrapper[5036]: I0110 16:39:44.389821 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9wf5v" podStartSLOduration=1.446659214 podStartE2EDuration="3.389784859s" podCreationTimestamp="2026-01-10 16:39:41 +0000 UTC" firstStartedPulling="2026-01-10 16:39:41.855073345 +0000 UTC m=+703.725308839" lastFinishedPulling="2026-01-10 16:39:43.798199 +0000 UTC m=+705.668434484" observedRunningTime="2026-01-10 16:39:44.381988048 +0000 UTC m=+706.252223562" watchObservedRunningTime="2026-01-10 16:39:44.389784859 +0000 UTC m=+706.260020393" Jan 10 16:39:44 crc kubenswrapper[5036]: I0110 16:39:44.664112 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.267204 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sdc2j"] Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.267923 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.278525 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sdc2j"] Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.391189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8m4\" (UniqueName: \"kubernetes.io/projected/3929902d-323d-44ec-84be-4069e262618f-kube-api-access-vc8m4\") pod \"openstack-operator-index-sdc2j\" (UID: \"3929902d-323d-44ec-84be-4069e262618f\") " pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.493917 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8m4\" (UniqueName: \"kubernetes.io/projected/3929902d-323d-44ec-84be-4069e262618f-kube-api-access-vc8m4\") pod \"openstack-operator-index-sdc2j\" (UID: \"3929902d-323d-44ec-84be-4069e262618f\") " pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.528347 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8m4\" (UniqueName: \"kubernetes.io/projected/3929902d-323d-44ec-84be-4069e262618f-kube-api-access-vc8m4\") pod \"openstack-operator-index-sdc2j\" (UID: \"3929902d-323d-44ec-84be-4069e262618f\") " pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.600101 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:45 crc kubenswrapper[5036]: I0110 16:39:45.838362 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sdc2j"] Jan 10 16:39:45 crc kubenswrapper[5036]: W0110 16:39:45.847006 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3929902d_323d_44ec_84be_4069e262618f.slice/crio-59861a4ff121ef7d6f7dcaab4cc8ab7558fda966f77b7652c51a0d6413e90501 WatchSource:0}: Error finding container 59861a4ff121ef7d6f7dcaab4cc8ab7558fda966f77b7652c51a0d6413e90501: Status 404 returned error can't find the container with id 59861a4ff121ef7d6f7dcaab4cc8ab7558fda966f77b7652c51a0d6413e90501 Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.372865 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sdc2j" event={"ID":"3929902d-323d-44ec-84be-4069e262618f","Type":"ContainerStarted","Data":"e5c0febce2464fbd63a0fa9f8269afec552e80eb64defa38a526b813f628d935"} Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.373261 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sdc2j" event={"ID":"3929902d-323d-44ec-84be-4069e262618f","Type":"ContainerStarted","Data":"59861a4ff121ef7d6f7dcaab4cc8ab7558fda966f77b7652c51a0d6413e90501"} Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.372952 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-9wf5v" podUID="e1236be2-97ae-4fe2-a09c-6e370099b77d" containerName="registry-server" containerID="cri-o://b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1" gracePeriod=2 Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.395127 5036 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sdc2j" podStartSLOduration=1.281487831 podStartE2EDuration="1.39510094s" podCreationTimestamp="2026-01-10 16:39:45 +0000 UTC" firstStartedPulling="2026-01-10 16:39:45.85129388 +0000 UTC m=+707.721529374" lastFinishedPulling="2026-01-10 16:39:45.964906989 +0000 UTC m=+707.835142483" observedRunningTime="2026-01-10 16:39:46.392303931 +0000 UTC m=+708.262539435" watchObservedRunningTime="2026-01-10 16:39:46.39510094 +0000 UTC m=+708.265336444" Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.748449 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.812793 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-5xsxl" Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.914978 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87k5v\" (UniqueName: \"kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v\") pod \"e1236be2-97ae-4fe2-a09c-6e370099b77d\" (UID: \"e1236be2-97ae-4fe2-a09c-6e370099b77d\") " Jan 10 16:39:46 crc kubenswrapper[5036]: I0110 16:39:46.920816 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v" (OuterVolumeSpecName: "kube-api-access-87k5v") pod "e1236be2-97ae-4fe2-a09c-6e370099b77d" (UID: "e1236be2-97ae-4fe2-a09c-6e370099b77d"). InnerVolumeSpecName "kube-api-access-87k5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.017020 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87k5v\" (UniqueName: \"kubernetes.io/projected/e1236be2-97ae-4fe2-a09c-6e370099b77d-kube-api-access-87k5v\") on node \"crc\" DevicePath \"\"" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.263058 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vvnr7" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.269179 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-qzgnv" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.385605 5036 generic.go:334] "Generic (PLEG): container finished" podID="e1236be2-97ae-4fe2-a09c-6e370099b77d" containerID="b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1" exitCode=0 Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.385737 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9wf5v" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.385753 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9wf5v" event={"ID":"e1236be2-97ae-4fe2-a09c-6e370099b77d","Type":"ContainerDied","Data":"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1"} Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.386175 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9wf5v" event={"ID":"e1236be2-97ae-4fe2-a09c-6e370099b77d","Type":"ContainerDied","Data":"7534e7bd63533de34336c5d48788f6965a88e4b943f13b62c8cd8384ce6a9570"} Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.386199 5036 scope.go:117] "RemoveContainer" containerID="b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.417071 5036 scope.go:117] "RemoveContainer" containerID="b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1" Jan 10 16:39:47 crc kubenswrapper[5036]: E0110 16:39:47.417610 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1\": container with ID starting with b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1 not found: ID does not exist" containerID="b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.417691 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1"} err="failed to get container status \"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1\": rpc error: code = NotFound desc = could not find container \"b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1\": container with ID starting with b1a80d81956478214f7c59700217b7237019487130d9202c17f6b5396c355ef1 not found: ID does not exist" Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.420329 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:47 crc kubenswrapper[5036]: I0110 16:39:47.426423 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-9wf5v"] Jan 10 16:39:48 crc kubenswrapper[5036]: I0110 16:39:48.520981 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1236be2-97ae-4fe2-a09c-6e370099b77d" path="/var/lib/kubelet/pods/e1236be2-97ae-4fe2-a09c-6e370099b77d/volumes" Jan 10 16:39:55 crc kubenswrapper[5036]: I0110 16:39:55.601123 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:55 crc kubenswrapper[5036]: I0110 16:39:55.601692 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:55 crc kubenswrapper[5036]: I0110 16:39:55.631721 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:39:56 crc kubenswrapper[5036]: I0110 16:39:56.484188 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-sdc2j" Jan 10 16:40:00 crc 
kubenswrapper[5036]: I0110 16:40:00.735454 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx"] Jan 10 16:40:00 crc kubenswrapper[5036]: E0110 16:40:00.736986 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1236be2-97ae-4fe2-a09c-6e370099b77d" containerName="registry-server" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.737003 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1236be2-97ae-4fe2-a09c-6e370099b77d" containerName="registry-server" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.737146 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1236be2-97ae-4fe2-a09c-6e370099b77d" containerName="registry-server" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.742638 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.744117 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx"] Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.746642 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pxmh5" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.842200 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.842345 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2824\" (UniqueName: \"kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.842422 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.944847 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.944952 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2824\" (UniqueName: 
\"kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.945009 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.945605 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.945624 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:00 crc kubenswrapper[5036]: I0110 16:40:00.980486 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2824\" (UniqueName: \"kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824\") pod \"55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:01 crc kubenswrapper[5036]: I0110 16:40:01.072007 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:01 crc kubenswrapper[5036]: I0110 16:40:01.291237 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx"] Jan 10 16:40:01 crc kubenswrapper[5036]: W0110 16:40:01.298769 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93cb83a_a272_4416_bff9_4da9aeb4f412.slice/crio-c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273 WatchSource:0}: Error finding container c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273: Status 404 returned error can't find the container with id c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273 Jan 10 16:40:01 crc kubenswrapper[5036]: I0110 16:40:01.491882 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerStarted","Data":"1b7450de36a903f5595a7c35fd99624af211deb1937a7966ee8a5ddc3c0a15ec"} Jan 10 16:40:01 crc kubenswrapper[5036]: I0110 16:40:01.492314 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerStarted","Data":"c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273"} Jan 10 16:40:02 crc kubenswrapper[5036]: I0110 16:40:02.500025 5036 generic.go:334] "Generic (PLEG): container finished" podID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerID="1b7450de36a903f5595a7c35fd99624af211deb1937a7966ee8a5ddc3c0a15ec" exitCode=0 Jan 10 16:40:02 crc kubenswrapper[5036]: I0110 16:40:02.500097 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerDied","Data":"1b7450de36a903f5595a7c35fd99624af211deb1937a7966ee8a5ddc3c0a15ec"} Jan 10 16:40:04 crc kubenswrapper[5036]: I0110 16:40:04.517789 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerStarted","Data":"a0a9fa9992334ba958a4c5c4e227d203d26f754a9f9fbb1275c668fd6e1c058c"} Jan 10 16:40:05 crc kubenswrapper[5036]: I0110 16:40:05.526020 5036 generic.go:334] "Generic (PLEG): container finished" podID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerID="a0a9fa9992334ba958a4c5c4e227d203d26f754a9f9fbb1275c668fd6e1c058c" exitCode=0 Jan 10 16:40:05 crc kubenswrapper[5036]: I0110 16:40:05.526066 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerDied","Data":"a0a9fa9992334ba958a4c5c4e227d203d26f754a9f9fbb1275c668fd6e1c058c"} Jan 10 16:40:06 crc kubenswrapper[5036]: I0110 16:40:06.536019 5036 generic.go:334] "Generic (PLEG): container finished" podID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerID="3a5070cebc98e52901a432f8ddb680e7904cece90ae0e960bbb419ddfc0a9eef" exitCode=0 Jan 10 16:40:06 crc kubenswrapper[5036]: I0110 16:40:06.536082 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerDied","Data":"3a5070cebc98e52901a432f8ddb680e7904cece90ae0e960bbb419ddfc0a9eef"} Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.839998 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.961417 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util\") pod \"b93cb83a-a272-4416-bff9-4da9aeb4f412\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.961507 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2824\" (UniqueName: \"kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824\") pod \"b93cb83a-a272-4416-bff9-4da9aeb4f412\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.961539 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle\") pod \"b93cb83a-a272-4416-bff9-4da9aeb4f412\" (UID: \"b93cb83a-a272-4416-bff9-4da9aeb4f412\") " Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.962740 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle" (OuterVolumeSpecName: "bundle") pod "b93cb83a-a272-4416-bff9-4da9aeb4f412" (UID: "b93cb83a-a272-4416-bff9-4da9aeb4f412"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.966997 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824" (OuterVolumeSpecName: "kube-api-access-x2824") pod "b93cb83a-a272-4416-bff9-4da9aeb4f412" (UID: "b93cb83a-a272-4416-bff9-4da9aeb4f412"). InnerVolumeSpecName "kube-api-access-x2824". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:40:07 crc kubenswrapper[5036]: I0110 16:40:07.983439 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util" (OuterVolumeSpecName: "util") pod "b93cb83a-a272-4416-bff9-4da9aeb4f412" (UID: "b93cb83a-a272-4416-bff9-4da9aeb4f412"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.063043 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2824\" (UniqueName: \"kubernetes.io/projected/b93cb83a-a272-4416-bff9-4da9aeb4f412-kube-api-access-x2824\") on node \"crc\" DevicePath \"\"" Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.063915 5036 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.064122 5036 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b93cb83a-a272-4416-bff9-4da9aeb4f412-util\") on node \"crc\" DevicePath \"\"" Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.553084 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" event={"ID":"b93cb83a-a272-4416-bff9-4da9aeb4f412","Type":"ContainerDied","Data":"c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273"} Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.553120 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx" Jan 10 16:40:08 crc kubenswrapper[5036]: I0110 16:40:08.553123 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2c30a138062b8c68a580337f4c5ed4cee4af32692c6bb86db07a13f3d1bb273" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.654792 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl"] Jan 10 16:40:13 crc kubenswrapper[5036]: E0110 16:40:13.655579 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="extract" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.655595 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="extract" Jan 10 16:40:13 crc kubenswrapper[5036]: E0110 16:40:13.655603 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="util" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.655613 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="util" Jan 10 16:40:13 crc kubenswrapper[5036]: E0110 16:40:13.655636 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="pull" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.655642 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="pull" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.655808 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93cb83a-a272-4416-bff9-4da9aeb4f412" containerName="extract" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.656398 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.660131 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qqbkn" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.690936 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl"] Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.747509 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hh4c\" (UniqueName: \"kubernetes.io/projected/0ab4dccd-a4ff-49f6-96bf-a7150425ff15-kube-api-access-7hh4c\") pod \"openstack-operator-controller-operator-5d4cd6578d-pt5gl\" (UID: \"0ab4dccd-a4ff-49f6-96bf-a7150425ff15\") " pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.848645 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hh4c\" (UniqueName: \"kubernetes.io/projected/0ab4dccd-a4ff-49f6-96bf-a7150425ff15-kube-api-access-7hh4c\") pod \"openstack-operator-controller-operator-5d4cd6578d-pt5gl\" (UID: \"0ab4dccd-a4ff-49f6-96bf-a7150425ff15\") " pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.872539 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hh4c\" (UniqueName: \"kubernetes.io/projected/0ab4dccd-a4ff-49f6-96bf-a7150425ff15-kube-api-access-7hh4c\") pod \"openstack-operator-controller-operator-5d4cd6578d-pt5gl\" (UID: \"0ab4dccd-a4ff-49f6-96bf-a7150425ff15\") " pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:13 crc kubenswrapper[5036]: I0110 16:40:13.975485 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:14 crc kubenswrapper[5036]: I0110 16:40:14.209775 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl"] Jan 10 16:40:14 crc kubenswrapper[5036]: I0110 16:40:14.600183 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" event={"ID":"0ab4dccd-a4ff-49f6-96bf-a7150425ff15","Type":"ContainerStarted","Data":"a7ca87b7717d2e2082395b8f9001f4c448eef3a7691404bb20b44cb859dc6bf8"} Jan 10 16:40:18 crc kubenswrapper[5036]: I0110 16:40:18.629136 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" event={"ID":"0ab4dccd-a4ff-49f6-96bf-a7150425ff15","Type":"ContainerStarted","Data":"998005ef8704be94b4ce77db1f1f19e86e068eb047d1ce34a104e58f96600c4e"} Jan 10 16:40:18 crc kubenswrapper[5036]: I0110 16:40:18.629739 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:18 crc kubenswrapper[5036]: I0110 16:40:18.654628 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" podStartSLOduration=2.094258939 podStartE2EDuration="5.654610299s" podCreationTimestamp="2026-01-10 16:40:13 +0000 UTC" firstStartedPulling="2026-01-10 16:40:14.215117783 +0000 UTC m=+736.085353277" lastFinishedPulling="2026-01-10 16:40:17.775469143 +0000 UTC m=+739.645704637" observedRunningTime="2026-01-10 16:40:18.651082449 +0000 UTC m=+740.521317943" watchObservedRunningTime="2026-01-10 16:40:18.654610299 +0000 UTC m=+740.524845783" Jan 10 16:40:23 crc kubenswrapper[5036]: I0110 16:40:23.979062 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5d4cd6578d-pt5gl" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.311222 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.312722 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.315735 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p55t\" (UniqueName: \"kubernetes.io/projected/a17f3d4e-41a9-4941-83f6-090808b6cb29-kube-api-access-5p55t\") pod \"barbican-operator-controller-manager-678b8c6d96-568pc\" (UID: \"a17f3d4e-41a9-4941-83f6-090808b6cb29\") " pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.319732 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f27xs" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.362447 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.366498 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.367495 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.370302 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.371442 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.372166 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7kmrs" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.376153 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-krhkk" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.378288 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.382934 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.396757 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.397609 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.406225 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.406888 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.408899 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-5m4qs" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.409408 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-z2kkz" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.421236 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj892\" (UniqueName: \"kubernetes.io/projected/f1b7f315-826c-4a66-9919-69b3c75a648e-kube-api-access-hj892\") pod \"cinder-operator-controller-manager-78979fc445-2qq47\" (UID: \"f1b7f315-826c-4a66-9919-69b3c75a648e\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.421279 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6lmk\" (UniqueName: \"kubernetes.io/projected/ecf84720-507a-4a26-8326-7ed56754871e-kube-api-access-g6lmk\") pod \"heat-operator-controller-manager-65c54c675d-ng9ld\" (UID: \"ecf84720-507a-4a26-8326-7ed56754871e\") " pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.421308 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p55t\" (UniqueName: \"kubernetes.io/projected/a17f3d4e-41a9-4941-83f6-090808b6cb29-kube-api-access-5p55t\") pod \"barbican-operator-controller-manager-678b8c6d96-568pc\" (UID: \"a17f3d4e-41a9-4941-83f6-090808b6cb29\") " pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.421343 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vzz\" (UniqueName: \"kubernetes.io/projected/09239a1e-ce39-49e7-a532-f7c353022176-kube-api-access-q2vzz\") pod \"glance-operator-controller-manager-5967c8645c-cbdjv\" (UID: \"09239a1e-ce39-49e7-a532-f7c353022176\") " pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.421376 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rms4\" (UniqueName: \"kubernetes.io/projected/52b19fea-05ac-4448-9446-33fbee11b2da-kube-api-access-6rms4\") pod \"designate-operator-controller-manager-66f8b87655-trzdf\" (UID: \"52b19fea-05ac-4448-9446-33fbee11b2da\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.448343 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.448998 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.453355 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p55t\" (UniqueName: \"kubernetes.io/projected/a17f3d4e-41a9-4941-83f6-090808b6cb29-kube-api-access-5p55t\") pod \"barbican-operator-controller-manager-678b8c6d96-568pc\" (UID: \"a17f3d4e-41a9-4941-83f6-090808b6cb29\") " pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.466598 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-cdwwt" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.474736 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.489286 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.510738 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.527598 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6lmk\" (UniqueName: \"kubernetes.io/projected/ecf84720-507a-4a26-8326-7ed56754871e-kube-api-access-g6lmk\") pod \"heat-operator-controller-manager-65c54c675d-ng9ld\" (UID: \"ecf84720-507a-4a26-8326-7ed56754871e\") " pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.527931 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vzz\" (UniqueName: \"kubernetes.io/projected/09239a1e-ce39-49e7-a532-f7c353022176-kube-api-access-q2vzz\") pod \"glance-operator-controller-manager-5967c8645c-cbdjv\" (UID: \"09239a1e-ce39-49e7-a532-f7c353022176\") " pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.528005 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbpw\" (UniqueName: \"kubernetes.io/projected/58ba757d-493c-4a4c-9aaa-a3178272b7cb-kube-api-access-rnbpw\") pod \"horizon-operator-controller-manager-7998b4cc7b-bjxnm\" (UID: \"58ba757d-493c-4a4c-9aaa-a3178272b7cb\") " pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.528050 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rms4\" (UniqueName: \"kubernetes.io/projected/52b19fea-05ac-4448-9446-33fbee11b2da-kube-api-access-6rms4\") pod \"designate-operator-controller-manager-66f8b87655-trzdf\" (UID: \"52b19fea-05ac-4448-9446-33fbee11b2da\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.528150 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj892\" (UniqueName: \"kubernetes.io/projected/f1b7f315-826c-4a66-9919-69b3c75a648e-kube-api-access-hj892\") pod \"cinder-operator-controller-manager-78979fc445-2qq47\" (UID: 
\"f1b7f315-826c-4a66-9919-69b3c75a648e\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.584947 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rms4\" (UniqueName: \"kubernetes.io/projected/52b19fea-05ac-4448-9446-33fbee11b2da-kube-api-access-6rms4\") pod \"designate-operator-controller-manager-66f8b87655-trzdf\" (UID: \"52b19fea-05ac-4448-9446-33fbee11b2da\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.584982 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.585646 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.589217 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj892\" (UniqueName: \"kubernetes.io/projected/f1b7f315-826c-4a66-9919-69b3c75a648e-kube-api-access-hj892\") pod \"cinder-operator-controller-manager-78979fc445-2qq47\" (UID: \"f1b7f315-826c-4a66-9919-69b3c75a648e\") " pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.591258 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6lmk\" (UniqueName: \"kubernetes.io/projected/ecf84720-507a-4a26-8326-7ed56754871e-kube-api-access-g6lmk\") pod \"heat-operator-controller-manager-65c54c675d-ng9ld\" (UID: \"ecf84720-507a-4a26-8326-7ed56754871e\") " pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.605783 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.606484 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.619384 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.619769 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.624157 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5tc7m" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.631229 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-7xt8r" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.632468 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vzz\" (UniqueName: \"kubernetes.io/projected/09239a1e-ce39-49e7-a532-f7c353022176-kube-api-access-q2vzz\") pod \"glance-operator-controller-manager-5967c8645c-cbdjv\" (UID: \"09239a1e-ce39-49e7-a532-f7c353022176\") " pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.636962 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.637976 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmxq\" (UniqueName: \"kubernetes.io/projected/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-kube-api-access-9gmxq\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.638030 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.638062 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbpw\" (UniqueName: \"kubernetes.io/projected/58ba757d-493c-4a4c-9aaa-a3178272b7cb-kube-api-access-rnbpw\") pod \"horizon-operator-controller-manager-7998b4cc7b-bjxnm\" (UID: \"58ba757d-493c-4a4c-9aaa-a3178272b7cb\") " pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.638089 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ls7\" (UniqueName: \"kubernetes.io/projected/611b3f4f-0b6d-4ef9-b040-eba991c4bfe4-kube-api-access-l2ls7\") pod \"keystone-operator-controller-manager-568985c78-cs2b2\" (UID: \"611b3f4f-0b6d-4ef9-b040-eba991c4bfe4\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.642734 5036 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.643487 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.645445 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-s79pw" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.658790 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.690350 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbpw\" (UniqueName: \"kubernetes.io/projected/58ba757d-493c-4a4c-9aaa-a3178272b7cb-kube-api-access-rnbpw\") pod \"horizon-operator-controller-manager-7998b4cc7b-bjxnm\" (UID: \"58ba757d-493c-4a4c-9aaa-a3178272b7cb\") " pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.690614 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.697290 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.698034 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.702351 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9qnhg" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.717097 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.718787 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.739019 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.741298 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.747910 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2xkg\" (UniqueName: \"kubernetes.io/projected/552c1d94-e289-46e0-8756-58982a7cdc4c-kube-api-access-q2xkg\") pod \"manila-operator-controller-manager-598945d5b8-l8ggv\" (UID: \"552c1d94-e289-46e0-8756-58982a7cdc4c\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.747954 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ls7\" (UniqueName: \"kubernetes.io/projected/611b3f4f-0b6d-4ef9-b040-eba991c4bfe4-kube-api-access-l2ls7\") pod \"keystone-operator-controller-manager-568985c78-cs2b2\" (UID: \"611b3f4f-0b6d-4ef9-b040-eba991c4bfe4\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.748049 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmxq\" (UniqueName: \"kubernetes.io/projected/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-kube-api-access-9gmxq\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.749338 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frxjm\" (UniqueName: \"kubernetes.io/projected/b4905be6-774a-4952-b195-f755688c7b26-kube-api-access-frxjm\") pod \"ironic-operator-controller-manager-5b47c74dd5-skh8x\" (UID: \"b4905be6-774a-4952-b195-f755688c7b26\") " pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.749385 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: E0110 16:40:43.749530 5036 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:43 crc kubenswrapper[5036]: E0110 16:40:43.749581 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert podName:80ddf12b-ee61-4d6f-a3fb-ff9aded793d7 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:44.249562831 +0000 UTC m=+766.119798315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert") pod "infra-operator-controller-manager-77c48c7859-xcmds" (UID: "80ddf12b-ee61-4d6f-a3fb-ff9aded793d7") : secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.775285 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.776069 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.813153 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pcpth" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.816466 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmxq\" (UniqueName: \"kubernetes.io/projected/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-kube-api-access-9gmxq\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.827939 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.847799 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.849223 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ls7\" (UniqueName: \"kubernetes.io/projected/611b3f4f-0b6d-4ef9-b040-eba991c4bfe4-kube-api-access-l2ls7\") pod \"keystone-operator-controller-manager-568985c78-cs2b2\" (UID: \"611b3f4f-0b6d-4ef9-b040-eba991c4bfe4\") " pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.904203 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frxjm\" (UniqueName: \"kubernetes.io/projected/b4905be6-774a-4952-b195-f755688c7b26-kube-api-access-frxjm\") pod \"ironic-operator-controller-manager-5b47c74dd5-skh8x\" (UID: \"b4905be6-774a-4952-b195-f755688c7b26\") " pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.904339 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2xkg\" (UniqueName: \"kubernetes.io/projected/552c1d94-e289-46e0-8756-58982a7cdc4c-kube-api-access-q2xkg\") pod \"manila-operator-controller-manager-598945d5b8-l8ggv\" (UID: \"552c1d94-e289-46e0-8756-58982a7cdc4c\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.915709 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.933503 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295"] 
Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.934698 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.937401 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.938166 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.950785 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2v4gs" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.950937 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qtf2n" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.951006 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frxjm\" (UniqueName: \"kubernetes.io/projected/b4905be6-774a-4952-b195-f755688c7b26-kube-api-access-frxjm\") pod \"ironic-operator-controller-manager-5b47c74dd5-skh8x\" (UID: \"b4905be6-774a-4952-b195-f755688c7b26\") " pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.956610 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.967312 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.971726 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.972010 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2xkg\" (UniqueName: \"kubernetes.io/projected/552c1d94-e289-46e0-8756-58982a7cdc4c-kube-api-access-q2xkg\") pod \"manila-operator-controller-manager-598945d5b8-l8ggv\" (UID: \"552c1d94-e289-46e0-8756-58982a7cdc4c\") " pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.972488 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.977895 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tzvx2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.980830 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2"] Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.983177 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.984560 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tnwlb" Jan 10 16:40:43 crc kubenswrapper[5036]: I0110 16:40:43.986799 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.005267 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2skz\" (UniqueName: \"kubernetes.io/projected/6414be0b-ef34-4c95-9e31-4124dcad6cc4-kube-api-access-x2skz\") pod \"neutron-operator-controller-manager-7cd87b778f-4t295\" (UID: \"6414be0b-ef34-4c95-9e31-4124dcad6cc4\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.005459 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdr69\" (UniqueName: \"kubernetes.io/projected/254c9f2b-ef77-4fbc-9884-c14caa297876-kube-api-access-tdr69\") pod \"octavia-operator-controller-manager-68c649d9d-wv445\" (UID: \"254c9f2b-ef77-4fbc-9884-c14caa297876\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.005557 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tkx6\" (UniqueName: \"kubernetes.io/projected/0a3b9993-b2fb-4dda-952a-413cd5a3e01a-kube-api-access-9tkx6\") pod \"mariadb-operator-controller-manager-746ccdd857-kkjhp\" (UID: \"0a3b9993-b2fb-4dda-952a-413cd5a3e01a\") " pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.005648 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5knx\" (UniqueName: \"kubernetes.io/projected/506aa4ca-31bb-48da-94b5-9ab7b43aea96-kube-api-access-s5knx\") pod \"nova-operator-controller-manager-5fbbf8b6cc-t7qtf\" (UID: \"506aa4ca-31bb-48da-94b5-9ab7b43aea96\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.008440 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.009391 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.019530 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.032108 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-npbcp" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.032219 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.034553 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.036197 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.036962 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fbd4h" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.050747 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.055851 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.056923 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.061240 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.062160 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qwwcj" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.063857 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.064633 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.066595 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mphcn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.083012 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.088172 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.105403 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106184 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106537 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfkd\" (UniqueName: \"kubernetes.io/projected/cf6aa765-9fbf-429d-83c1-db4671e7600c-kube-api-access-rrfkd\") pod \"ovn-operator-controller-manager-bf6d4f946-zz7v2\" (UID: \"cf6aa765-9fbf-429d-83c1-db4671e7600c\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106666 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdr69\" (UniqueName: \"kubernetes.io/projected/254c9f2b-ef77-4fbc-9884-c14caa297876-kube-api-access-tdr69\") pod \"octavia-operator-controller-manager-68c649d9d-wv445\" (UID: \"254c9f2b-ef77-4fbc-9884-c14caa297876\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106803 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tkx6\" (UniqueName: \"kubernetes.io/projected/0a3b9993-b2fb-4dda-952a-413cd5a3e01a-kube-api-access-9tkx6\") pod \"mariadb-operator-controller-manager-746ccdd857-kkjhp\" (UID: \"0a3b9993-b2fb-4dda-952a-413cd5a3e01a\") " pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106888 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjv6g\" (UniqueName: \"kubernetes.io/projected/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-kube-api-access-sjv6g\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.106993 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5knx\" (UniqueName: \"kubernetes.io/projected/506aa4ca-31bb-48da-94b5-9ab7b43aea96-kube-api-access-s5knx\") pod \"nova-operator-controller-manager-5fbbf8b6cc-t7qtf\" (UID: \"506aa4ca-31bb-48da-94b5-9ab7b43aea96\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107090 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107212 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnqs5\" (UniqueName: \"kubernetes.io/projected/2e9ebb80-028a-43ac-b9cb-379dd1eda24e-kube-api-access-mnqs5\") pod \"swift-operator-controller-manager-bb586bbf4-tn7cg\" (UID: \"2e9ebb80-028a-43ac-b9cb-379dd1eda24e\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107321 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x2skz\" (UniqueName: \"kubernetes.io/projected/6414be0b-ef34-4c95-9e31-4124dcad6cc4-kube-api-access-x2skz\") pod \"neutron-operator-controller-manager-7cd87b778f-4t295\" (UID: \"6414be0b-ef34-4c95-9e31-4124dcad6cc4\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107427 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vhr\" (UniqueName: \"kubernetes.io/projected/6283e4f6-c60e-4bff-b622-181c4abbc8a6-kube-api-access-26vhr\") pod \"placement-operator-controller-manager-84587ffc8-l7b7s\" (UID: \"6283e4f6-c60e-4bff-b622-181c4abbc8a6\") " pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107521 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtqlw\" (UniqueName: \"kubernetes.io/projected/2c21d679-225e-4c33-8920-06a85ae163b6-kube-api-access-wtqlw\") pod \"telemetry-operator-controller-manager-68d988df55-88zlb\" (UID: \"2c21d679-225e-4c33-8920-06a85ae163b6\") " pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.107717 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-btzgz" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.121961 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.122268 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.130075 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.145493 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.146368 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.148202 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tg4rj" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.152638 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5knx\" (UniqueName: \"kubernetes.io/projected/506aa4ca-31bb-48da-94b5-9ab7b43aea96-kube-api-access-s5knx\") pod \"nova-operator-controller-manager-5fbbf8b6cc-t7qtf\" (UID: \"506aa4ca-31bb-48da-94b5-9ab7b43aea96\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.153432 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2skz\" (UniqueName: \"kubernetes.io/projected/6414be0b-ef34-4c95-9e31-4124dcad6cc4-kube-api-access-x2skz\") pod \"neutron-operator-controller-manager-7cd87b778f-4t295\" (UID: \"6414be0b-ef34-4c95-9e31-4124dcad6cc4\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.153485 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tkx6\" (UniqueName: \"kubernetes.io/projected/0a3b9993-b2fb-4dda-952a-413cd5a3e01a-kube-api-access-9tkx6\") pod \"mariadb-operator-controller-manager-746ccdd857-kkjhp\" (UID: \"0a3b9993-b2fb-4dda-952a-413cd5a3e01a\") " pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.153629 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdr69\" (UniqueName: \"kubernetes.io/projected/254c9f2b-ef77-4fbc-9884-c14caa297876-kube-api-access-tdr69\") pod \"octavia-operator-controller-manager-68c649d9d-wv445\" (UID: \"254c9f2b-ef77-4fbc-9884-c14caa297876\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.155509 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.178923 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.189554 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.191846 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.192573 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.200973 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.204210 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.204929 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.205140 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fkb7b" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.211446 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjv6g\" (UniqueName: \"kubernetes.io/projected/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-kube-api-access-sjv6g\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.211517 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.211582 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnqs5\" (UniqueName: \"kubernetes.io/projected/2e9ebb80-028a-43ac-b9cb-379dd1eda24e-kube-api-access-mnqs5\") pod \"swift-operator-controller-manager-bb586bbf4-tn7cg\" (UID: \"2e9ebb80-028a-43ac-b9cb-379dd1eda24e\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.211626 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vhr\" (UniqueName: \"kubernetes.io/projected/6283e4f6-c60e-4bff-b622-181c4abbc8a6-kube-api-access-26vhr\") pod \"placement-operator-controller-manager-84587ffc8-l7b7s\" (UID: \"6283e4f6-c60e-4bff-b622-181c4abbc8a6\") " pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.211654 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtqlw\" (UniqueName: \"kubernetes.io/projected/2c21d679-225e-4c33-8920-06a85ae163b6-kube-api-access-wtqlw\") pod \"telemetry-operator-controller-manager-68d988df55-88zlb\" (UID: \"2c21d679-225e-4c33-8920-06a85ae163b6\") " pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.212813 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpn4\" (UniqueName: \"kubernetes.io/projected/7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42-kube-api-access-bgpn4\") pod \"watcher-operator-controller-manager-9dbdf6486-fwmft\" (UID: 
\"7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.212863 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfkd\" (UniqueName: \"kubernetes.io/projected/cf6aa765-9fbf-429d-83c1-db4671e7600c-kube-api-access-rrfkd\") pod \"ovn-operator-controller-manager-bf6d4f946-zz7v2\" (UID: \"cf6aa765-9fbf-429d-83c1-db4671e7600c\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.212971 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9sz\" (UniqueName: \"kubernetes.io/projected/f3046ad8-aadd-4883-82b9-a794ddce82b9-kube-api-access-gf9sz\") pod \"test-operator-controller-manager-6c866cfdcb-nbdkb\" (UID: \"f3046ad8-aadd-4883-82b9-a794ddce82b9\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.213596 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.213646 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:44.713631736 +0000 UTC m=+766.583867230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.236532 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjv6g\" (UniqueName: \"kubernetes.io/projected/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-kube-api-access-sjv6g\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.277952 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnqs5\" (UniqueName: \"kubernetes.io/projected/2e9ebb80-028a-43ac-b9cb-379dd1eda24e-kube-api-access-mnqs5\") pod \"swift-operator-controller-manager-bb586bbf4-tn7cg\" (UID: \"2e9ebb80-028a-43ac-b9cb-379dd1eda24e\") " pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.277976 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfkd\" (UniqueName: \"kubernetes.io/projected/cf6aa765-9fbf-429d-83c1-db4671e7600c-kube-api-access-rrfkd\") pod \"ovn-operator-controller-manager-bf6d4f946-zz7v2\" (UID: \"cf6aa765-9fbf-429d-83c1-db4671e7600c\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.278479 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtqlw\" (UniqueName: 
\"kubernetes.io/projected/2c21d679-225e-4c33-8920-06a85ae163b6-kube-api-access-wtqlw\") pod \"telemetry-operator-controller-manager-68d988df55-88zlb\" (UID: \"2c21d679-225e-4c33-8920-06a85ae163b6\") " pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.279501 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vhr\" (UniqueName: \"kubernetes.io/projected/6283e4f6-c60e-4bff-b622-181c4abbc8a6-kube-api-access-26vhr\") pod \"placement-operator-controller-manager-84587ffc8-l7b7s\" (UID: \"6283e4f6-c60e-4bff-b622-181c4abbc8a6\") " pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.280646 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.282342 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.287543 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sfvn6" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.303658 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.309267 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.313815 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpn4\" (UniqueName: \"kubernetes.io/projected/7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42-kube-api-access-bgpn4\") pod \"watcher-operator-controller-manager-9dbdf6486-fwmft\" (UID: \"7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.313862 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9sz\" (UniqueName: \"kubernetes.io/projected/f3046ad8-aadd-4883-82b9-a794ddce82b9-kube-api-access-gf9sz\") pod \"test-operator-controller-manager-6c866cfdcb-nbdkb\" (UID: \"f3046ad8-aadd-4883-82b9-a794ddce82b9\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.313899 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.313920 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " 
pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.313961 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.314045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqj2t\" (UniqueName: \"kubernetes.io/projected/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-kube-api-access-tqj2t\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.314064 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7n6j\" (UniqueName: \"kubernetes.io/projected/4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd-kube-api-access-s7n6j\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gcjhz\" (UID: \"4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.314368 5036 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.314417 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert podName:80ddf12b-ee61-4d6f-a3fb-ff9aded793d7 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:45.314400139 +0000 UTC m=+767.184635633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert") pod "infra-operator-controller-manager-77c48c7859-xcmds" (UID: "80ddf12b-ee61-4d6f-a3fb-ff9aded793d7") : secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.333491 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.340803 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpn4\" (UniqueName: \"kubernetes.io/projected/7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42-kube-api-access-bgpn4\") pod \"watcher-operator-controller-manager-9dbdf6486-fwmft\" (UID: \"7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42\") " pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.341051 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9sz\" (UniqueName: \"kubernetes.io/projected/f3046ad8-aadd-4883-82b9-a794ddce82b9-kube-api-access-gf9sz\") pod \"test-operator-controller-manager-6c866cfdcb-nbdkb\" (UID: \"f3046ad8-aadd-4883-82b9-a794ddce82b9\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.350178 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.365091 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.376017 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.406742 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.415500 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqj2t\" (UniqueName: \"kubernetes.io/projected/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-kube-api-access-tqj2t\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.415543 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7n6j\" (UniqueName: \"kubernetes.io/projected/4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd-kube-api-access-s7n6j\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gcjhz\" (UID: \"4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.415613 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.415630 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" 
(UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.415823 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.415871 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:44.915855171 +0000 UTC m=+766.786090665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.416237 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.416267 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:44.916258023 +0000 UTC m=+766.786493517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.441410 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqj2t\" (UniqueName: \"kubernetes.io/projected/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-kube-api-access-tqj2t\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.442207 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7n6j\" (UniqueName: \"kubernetes.io/projected/4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd-kube-api-access-s7n6j\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gcjhz\" (UID: \"4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.450617 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.462862 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.522799 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.542589 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.542634 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.650967 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf"] Jan 10 16:40:44 crc kubenswrapper[5036]: W0110 16:40:44.682999 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b19fea_05ac_4448_9446_33fbee11b2da.slice/crio-a243b97b0827538139b8c0aa87d614fd47afa32f7ebfa79e16419301d1d85c35 WatchSource:0}: Error finding container a243b97b0827538139b8c0aa87d614fd47afa32f7ebfa79e16419301d1d85c35: Status 404 returned error can't find the container with id a243b97b0827538139b8c0aa87d614fd47afa32f7ebfa79e16419301d1d85c35 Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.709430 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.724002 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.724180 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.724232 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:45.724214632 +0000 UTC m=+767.594450126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.926820 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.926865 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.926988 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.927042 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:45.927020153 +0000 UTC m=+767.797255647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.927089 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: E0110 16:40:44.927173 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:45.927148907 +0000 UTC m=+767.797384461 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.953602 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.964031 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv"] Jan 10 16:40:44 crc kubenswrapper[5036]: I0110 16:40:44.994979 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.023881 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2"] Jan 10 16:40:45 crc kubenswrapper[5036]: W0110 16:40:45.038898 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552c1d94_e289_46e0_8756_58982a7cdc4c.slice/crio-62431ed3a6132d84b6d606b101bb2bbf5dcd815b6bf005be290a6f6afbf72211 WatchSource:0}: Error finding container 62431ed3a6132d84b6d606b101bb2bbf5dcd815b6bf005be290a6f6afbf72211: Status 404 returned error can't find the container with id 62431ed3a6132d84b6d606b101bb2bbf5dcd815b6bf005be290a6f6afbf72211 Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.041452 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv"] Jan 10 16:40:45 crc kubenswrapper[5036]: W0110 16:40:45.049946 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4905be6_774a_4952_b195_f755688c7b26.slice/crio-197786bf93daba5964482028401afc3a3caaaeca4b0b8fa780778ab53c2014e6 WatchSource:0}: Error finding container 197786bf93daba5964482028401afc3a3caaaeca4b0b8fa780778ab53c2014e6: Status 404 returned error can't find the container with id 197786bf93daba5964482028401afc3a3caaaeca4b0b8fa780778ab53c2014e6 Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.057787 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.060660 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" event={"ID":"58ba757d-493c-4a4c-9aaa-a3178272b7cb","Type":"ContainerStarted","Data":"c65f5b87d86473b9bce94a9500d3687b1d0aa22ff13a6ab30506aae709158fe1"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.063277 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" event={"ID":"ecf84720-507a-4a26-8326-7ed56754871e","Type":"ContainerStarted","Data":"0244cc75b5c7626ac65f16ee24a4e00e7ec9f5343ce64e308bd65f224a7cb49b"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.063329 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.067712 5036 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" event={"ID":"254c9f2b-ef77-4fbc-9884-c14caa297876","Type":"ContainerStarted","Data":"c5c88ddcaed993fe671a895810ecfbdf5fd7bad94b0c7fcf2d4894b3aa831fd5"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.068443 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" event={"ID":"f1b7f315-826c-4a66-9919-69b3c75a648e","Type":"ContainerStarted","Data":"2e853ed07801b469ab77d909b135a3ce6b3c24de07057f95c2f87efab41ab993"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.069150 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" event={"ID":"a17f3d4e-41a9-4941-83f6-090808b6cb29","Type":"ContainerStarted","Data":"76c02180c37708de2bcb76f8844c014de62158b85541d0b4230f1428aaed0204"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.070058 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" event={"ID":"611b3f4f-0b6d-4ef9-b040-eba991c4bfe4","Type":"ContainerStarted","Data":"aae94d705f06c46d4b1b404a112c7a3867fa2398bb64503c54ce77ace7849b48"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.070783 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" event={"ID":"09239a1e-ce39-49e7-a532-f7c353022176","Type":"ContainerStarted","Data":"811f90d6e8f27cfede3c5252d8b249aeaabc12d77bcecd3956463b73ea91f963"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.071480 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" event={"ID":"552c1d94-e289-46e0-8756-58982a7cdc4c","Type":"ContainerStarted","Data":"62431ed3a6132d84b6d606b101bb2bbf5dcd815b6bf005be290a6f6afbf72211"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.072190 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" event={"ID":"b4905be6-774a-4952-b195-f755688c7b26","Type":"ContainerStarted","Data":"197786bf93daba5964482028401afc3a3caaaeca4b0b8fa780778ab53c2014e6"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.072904 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" event={"ID":"52b19fea-05ac-4448-9446-33fbee11b2da","Type":"ContainerStarted","Data":"a243b97b0827538139b8c0aa87d614fd47afa32f7ebfa79e16419301d1d85c35"} Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.336155 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.336362 5036 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.336462 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert podName:80ddf12b-ee61-4d6f-a3fb-ff9aded793d7 
nodeName:}" failed. No retries permitted until 2026-01-10 16:40:47.336438134 +0000 UTC m=+769.206673688 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert") pod "infra-operator-controller-manager-77c48c7859-xcmds" (UID: "80ddf12b-ee61-4d6f-a3fb-ff9aded793d7") : secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.404855 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb"] Jan 10 16:40:45 crc kubenswrapper[5036]: W0110 16:40:45.424479 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9ebb80_028a_43ac_b9cb_379dd1eda24e.slice/crio-29d636f07dcaa99377bd1a0627036cef6bb9cd9c8e3ca24413c93bba6da78776 WatchSource:0}: Error finding container 29d636f07dcaa99377bd1a0627036cef6bb9cd9c8e3ca24413c93bba6da78776: Status 404 returned error can't find the container with id 29d636f07dcaa99377bd1a0627036cef6bb9cd9c8e3ca24413c93bba6da78776 Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.427077 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.433417 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.445503 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf"] Jan 10 16:40:45 crc kubenswrapper[5036]: W0110 16:40:45.452020 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506aa4ca_31bb_48da_94b5_9ab7b43aea96.slice/crio-b4bddf38f282fdc5657496a183111e2e729e0b4aa1f33d3affdcf6445043a3a8 WatchSource:0}: Error finding container b4bddf38f282fdc5657496a183111e2e729e0b4aa1f33d3affdcf6445043a3a8: Status 404 returned error can't find the container with id b4bddf38f282fdc5657496a183111e2e729e0b4aa1f33d3affdcf6445043a3a8 Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.473132 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.479590 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.496743 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.499403 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.505883 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.516704 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz"] Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.752888 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.753083 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.753290 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:47.753272697 +0000 UTC m=+769.623508201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.957797 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:45 crc kubenswrapper[5036]: I0110 16:40:45.957843 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.958018 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.958082 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:47.958065725 +0000 UTC m=+769.828301219 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.958380 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:40:45 crc kubenswrapper[5036]: E0110 16:40:45.958410 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. 
No retries permitted until 2026-01-10 16:40:47.958403094 +0000 UTC m=+769.828638588 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.062362 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x2skz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-4t295_openstack-operators(6414be0b-ef34-4c95-9e31-4124dcad6cc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.067264 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" podUID="6414be0b-ef34-4c95-9e31-4124dcad6cc4" Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.083616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" 
event={"ID":"6283e4f6-c60e-4bff-b622-181c4abbc8a6","Type":"ContainerStarted","Data":"e7a49ebc5bd69d6d432d0db73d23062ae1acb2678d367412c2b47c1d4f6e7950"} Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.084822 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" event={"ID":"506aa4ca-31bb-48da-94b5-9ab7b43aea96","Type":"ContainerStarted","Data":"b4bddf38f282fdc5657496a183111e2e729e0b4aa1f33d3affdcf6445043a3a8"} Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.087436 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" event={"ID":"2e9ebb80-028a-43ac-b9cb-379dd1eda24e","Type":"ContainerStarted","Data":"29d636f07dcaa99377bd1a0627036cef6bb9cd9c8e3ca24413c93bba6da78776"} Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.094884 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3c1b2858c64110448d801905fbbf3ffe7f78d264cc46ab12ab2d724842dba309,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtqlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-68d988df55-88zlb_openstack-operators(2c21d679-225e-4c33-8920-06a85ae163b6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.096346 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" podUID="2c21d679-225e-4c33-8920-06a85ae163b6" Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.099844 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" event={"ID":"6414be0b-ef34-4c95-9e31-4124dcad6cc4","Type":"ContainerStarted","Data":"d523963abb46b9ac90ae6969cd80b36a8b9268e114e299d501147885c9b4e3c8"} Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.101528 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" podUID="6414be0b-ef34-4c95-9e31-4124dcad6cc4" Jan 10 16:40:46 crc kubenswrapper[5036]: W0110 16:40:46.103344 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ddc3dbc_f7b1_4627_9740_9e2f5c0296fd.slice/crio-810e30bc678c90566cca0325c5893665b631b78cf25cbbea327d4243fff5fbf0 WatchSource:0}: Error finding container 810e30bc678c90566cca0325c5893665b631b78cf25cbbea327d4243fff5fbf0: Status 404 returned error can't find the container with id 810e30bc678c90566cca0325c5893665b631b78cf25cbbea327d4243fff5fbf0 Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.103546 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" event={"ID":"0a3b9993-b2fb-4dda-952a-413cd5a3e01a","Type":"ContainerStarted","Data":"6508d220daf050140d4f9ba1bda6110514610ccc794e6bb33ca442c80a0d9756"} Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.107600 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" event={"ID":"7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42","Type":"ContainerStarted","Data":"ba0e3245273fe24f6cdd67709caebc4aea1c566636c870909df60ed6d1f078f7"} Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.112448 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s7n6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gcjhz_openstack-operators(4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 10 16:40:46 crc kubenswrapper[5036]: E0110 16:40:46.114057 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podUID="4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd" Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.118381 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" event={"ID":"cf6aa765-9fbf-429d-83c1-db4671e7600c","Type":"ContainerStarted","Data":"445e26a06f25689ad8ebb1cafb993f99d726615c3663b63eeeaf4e7cc48c54a6"} Jan 10 16:40:46 crc kubenswrapper[5036]: I0110 16:40:46.121694 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" event={"ID":"f3046ad8-aadd-4883-82b9-a794ddce82b9","Type":"ContainerStarted","Data":"6455afa7ae1a157c06237688734e01129da59ac4e86bf7aa2904f1af2d92455d"} Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.140033 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" event={"ID":"4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd","Type":"ContainerStarted","Data":"810e30bc678c90566cca0325c5893665b631b78cf25cbbea327d4243fff5fbf0"} Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.141250 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podUID="4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd" Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.143686 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" event={"ID":"2c21d679-225e-4c33-8920-06a85ae163b6","Type":"ContainerStarted","Data":"88407c7882cbdc98905702ed932e024fccad3fa4302b4282620bba0870b836b9"} Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.145325 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" podUID="6414be0b-ef34-4c95-9e31-4124dcad6cc4" Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.169373 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3c1b2858c64110448d801905fbbf3ffe7f78d264cc46ab12ab2d724842dba309\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" podUID="2c21d679-225e-4c33-8920-06a85ae163b6" Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.381441 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.381781 5036 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.381861 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert podName:80ddf12b-ee61-4d6f-a3fb-ff9aded793d7 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:51.381826054 +0000 UTC m=+773.252061548 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert") pod "infra-operator-controller-manager-77c48c7859-xcmds" (UID: "80ddf12b-ee61-4d6f-a3fb-ff9aded793d7") : secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.788589 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.788958 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.789044 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:51.789020713 +0000 UTC m=+773.659256207 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.992867 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:47 crc kubenswrapper[5036]: I0110 16:40:47.993207 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.993363 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.993434 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:51.99341938 +0000 UTC m=+773.863654874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.993793 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:40:47 crc kubenswrapper[5036]: E0110 16:40:47.993948 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:40:51.993823201 +0000 UTC m=+773.864058685 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:40:48 crc kubenswrapper[5036]: E0110 16:40:48.170872 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podUID="4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd" Jan 10 16:40:48 crc kubenswrapper[5036]: E0110 16:40:48.171318 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3c1b2858c64110448d801905fbbf3ffe7f78d264cc46ab12ab2d724842dba309\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" podUID="2c21d679-225e-4c33-8920-06a85ae163b6" Jan 10 16:40:51 crc kubenswrapper[5036]: I0110 16:40:51.449866 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:51 crc kubenswrapper[5036]: E0110 16:40:51.450040 5036 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:51 crc kubenswrapper[5036]: E0110 16:40:51.450106 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert podName:80ddf12b-ee61-4d6f-a3fb-ff9aded793d7 nodeName:}" failed. No retries permitted until 2026-01-10 16:40:59.450088094 +0000 UTC m=+781.320323588 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert") pod "infra-operator-controller-manager-77c48c7859-xcmds" (UID: "80ddf12b-ee61-4d6f-a3fb-ff9aded793d7") : secret "infra-operator-webhook-server-cert" not found Jan 10 16:40:51 crc kubenswrapper[5036]: I0110 16:40:51.854888 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:51 crc kubenswrapper[5036]: E0110 16:40:51.855061 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:51 crc kubenswrapper[5036]: E0110 16:40:51.855149 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. 
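The MountVolume.SetUp failures in this stretch all trace back to Secrets that do not yet exist in openstack-operators: infra-operator-webhook-server-cert, openstack-baremetal-operator-webhook-server-cert, webhook-server-cert and metrics-server-cert. The log itself does not say what creates them; presumably cert-manager or the operator bundle does so slightly later, since the same mounts succeed further down at 16:40:59 and 16:41:16. A quick way to confirm which of the expected secrets are still missing is a small client-go check like the sketch below (secret names and namespace are taken from the log; the kubeconfig path is a placeholder).

```go
// Minimal sketch: report which of the webhook/metrics cert Secrets referenced
// by the failing volume mounts exist yet. Names and namespace come from the
// log above; the kubeconfig path is hypothetical.
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	ns := "openstack-operators"
	names := []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
		"webhook-server-cert",
		"metrics-server-cert",
	}
	for _, name := range names {
		_, err := client.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		switch {
		case apierrors.IsNotFound(err):
			fmt.Printf("missing: %s/%s\n", ns, name)
		case err != nil:
			fmt.Printf("error checking %s/%s: %v\n", ns, name, err)
		default:
			fmt.Printf("present: %s/%s\n", ns, name)
		}
	}
}
```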
No retries permitted until 2026-01-10 16:40:59.855127621 +0000 UTC m=+781.725363115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:52 crc kubenswrapper[5036]: I0110 16:40:52.058036 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:52 crc kubenswrapper[5036]: I0110 16:40:52.058084 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:40:52 crc kubenswrapper[5036]: E0110 16:40:52.058203 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:40:52 crc kubenswrapper[5036]: E0110 16:40:52.058248 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:41:00.058234601 +0000 UTC m=+781.928470095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:40:52 crc kubenswrapper[5036]: E0110 16:40:52.058977 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:40:52 crc kubenswrapper[5036]: E0110 16:40:52.059091 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:41:00.059067445 +0000 UTC m=+781.929302959 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:40:55 crc kubenswrapper[5036]: I0110 16:40:55.903755 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:40:55 crc kubenswrapper[5036]: I0110 16:40:55.904119 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:40:58 crc kubenswrapper[5036]: E0110 16:40:58.936315 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7" Jan 10 16:40:58 crc kubenswrapper[5036]: E0110 16:40:58.937062 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hj892,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-78979fc445-2qq47_openstack-operators(f1b7f315-826c-4a66-9919-69b3c75a648e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:40:58 crc kubenswrapper[5036]: E0110 16:40:58.938263 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" podUID="f1b7f315-826c-4a66-9919-69b3c75a648e" Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.254348 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:174acf70c084144827fb8f96c5401a0a8def953bf0ff8929dccd629a550491b7\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" podUID="f1b7f315-826c-4a66-9919-69b3c75a648e" Jan 10 16:40:59 crc kubenswrapper[5036]: I0110 16:40:59.496788 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:59 crc kubenswrapper[5036]: I0110 16:40:59.506138 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80ddf12b-ee61-4d6f-a3fb-ff9aded793d7-cert\") pod \"infra-operator-controller-manager-77c48c7859-xcmds\" (UID: \"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:59 crc kubenswrapper[5036]: I0110 16:40:59.641192 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5tc7m" Jan 10 16:40:59 crc kubenswrapper[5036]: I0110 16:40:59.649421 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.760285 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:9e8e9b030deea84e3da4fef0e6782f427c153a53f30f4332cf0ae3f9e16eeb5e" Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.760587 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:9e8e9b030deea84e3da4fef0e6782f427c153a53f30f4332cf0ae3f9e16eeb5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26vhr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-84587ffc8-l7b7s_openstack-operators(6283e4f6-c60e-4bff-b622-181c4abbc8a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.761837 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" podUID="6283e4f6-c60e-4bff-b622-181c4abbc8a6" Jan 10 16:40:59 crc kubenswrapper[5036]: I0110 16:40:59.903106 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.903269 5036 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:40:59 crc kubenswrapper[5036]: E0110 16:40:59.903323 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert podName:f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1 nodeName:}" failed. No retries permitted until 2026-01-10 16:41:15.903307559 +0000 UTC m=+797.773543063 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert") pod "openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" (UID: "f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 10 16:41:00 crc kubenswrapper[5036]: I0110 16:41:00.108791 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:00 crc kubenswrapper[5036]: I0110 16:41:00.108841 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.108986 5036 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.109061 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:41:16.109041821 +0000 UTC m=+797.979277315 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "metrics-server-cert" not found Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.108993 5036 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.109846 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs podName:de8e8f66-6d85-43d5-94a4-613fb3bfc53b nodeName:}" failed. No retries permitted until 2026-01-10 16:41:16.109827744 +0000 UTC m=+797.980063238 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs") pod "openstack-operator-controller-manager-56458c9ddd-p4bsn" (UID: "de8e8f66-6d85-43d5-94a4-613fb3bfc53b") : secret "webhook-server-cert" not found Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.261895 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:9e8e9b030deea84e3da4fef0e6782f427c153a53f30f4332cf0ae3f9e16eeb5e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" podUID="6283e4f6-c60e-4bff-b622-181c4abbc8a6" Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.347607 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:d07b15d4f65496b9a5dccfded18d4d5812b0a86d76b6350ad6dfdb0970163134" Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.347818 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:d07b15d4f65496b9a5dccfded18d4d5812b0a86d76b6350ad6dfdb0970163134,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5p55t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-operator-controller-manager-678b8c6d96-568pc_openstack-operators(a17f3d4e-41a9-4941-83f6-090808b6cb29): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:41:00 crc kubenswrapper[5036]: E0110 16:41:00.350096 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" podUID="a17f3d4e-41a9-4941-83f6-090808b6cb29" Jan 10 16:41:01 crc kubenswrapper[5036]: E0110 16:41:01.267917 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:d07b15d4f65496b9a5dccfded18d4d5812b0a86d76b6350ad6dfdb0970163134\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" podUID="a17f3d4e-41a9-4941-83f6-090808b6cb29" Jan 10 16:41:06 crc kubenswrapper[5036]: E0110 16:41:06.493695 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.73:5001/openstack-k8s-operators/glance-operator:d1afba5d682ba7ec6c5c10756cb9a769b2e805da" Jan 10 16:41:06 crc kubenswrapper[5036]: E0110 16:41:06.494076 5036 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.73:5001/openstack-k8s-operators/glance-operator:d1afba5d682ba7ec6c5c10756cb9a769b2e805da" Jan 10 16:41:06 crc kubenswrapper[5036]: E0110 16:41:06.494250 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.73:5001/openstack-k8s-operators/glance-operator:d1afba5d682ba7ec6c5c10756cb9a769b2e805da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q2vzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5967c8645c-cbdjv_openstack-operators(09239a1e-ce39-49e7-a532-f7c353022176): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:41:06 crc kubenswrapper[5036]: E0110 16:41:06.495447 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" podUID="09239a1e-ce39-49e7-a532-f7c353022176" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.039838 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.040064 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2ls7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-568985c78-cs2b2_openstack-operators(611b3f4f-0b6d-4ef9-b040-eba991c4bfe4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.041260 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" podUID="611b3f4f-0b6d-4ef9-b040-eba991c4bfe4" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.313884 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.73:5001/openstack-k8s-operators/glance-operator:d1afba5d682ba7ec6c5c10756cb9a769b2e805da\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" podUID="09239a1e-ce39-49e7-a532-f7c353022176" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.315910 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:879d3d679b58ae84419b7907ad092ad4d24bcc9222ce621ce464fd0fea347b0c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" podUID="611b3f4f-0b6d-4ef9-b040-eba991c4bfe4" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.639143 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.639311 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 
0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s5knx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-t7qtf_openstack-operators(506aa4ca-31bb-48da-94b5-9ab7b43aea96): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:41:07 crc kubenswrapper[5036]: E0110 16:41:07.640638 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" podUID="506aa4ca-31bb-48da-94b5-9ab7b43aea96" Jan 10 16:41:08 crc kubenswrapper[5036]: E0110 16:41:08.321944 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" podUID="506aa4ca-31bb-48da-94b5-9ab7b43aea96" Jan 10 16:41:15 crc kubenswrapper[5036]: I0110 16:41:15.995654 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.002894 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1-cert\") pod \"openstack-baremetal-operator-controller-manager-5b4889549f2j7sh\" (UID: \"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.093119 5036 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-fbd4h" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.100762 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.199881 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.199967 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.205707 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-metrics-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.206352 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/de8e8f66-6d85-43d5-94a4-613fb3bfc53b-webhook-certs\") pod \"openstack-operator-controller-manager-56458c9ddd-p4bsn\" (UID: \"de8e8f66-6d85-43d5-94a4-613fb3bfc53b\") " pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.499745 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fkb7b" Jan 10 16:41:16 crc kubenswrapper[5036]: I0110 16:41:16.508106 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:21 crc kubenswrapper[5036]: I0110 16:41:21.592793 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds"] Jan 10 16:41:21 crc kubenswrapper[5036]: W0110 16:41:21.847286 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80ddf12b_ee61_4d6f_a3fb_ff9aded793d7.slice/crio-4e246a53af074d002a4f957ee1d87c1f28deb3605b2c754016028c2862f6ac3c WatchSource:0}: Error finding container 4e246a53af074d002a4f957ee1d87c1f28deb3605b2c754016028c2862f6ac3c: Status 404 returned error can't find the container with id 4e246a53af074d002a4f957ee1d87c1f28deb3605b2c754016028c2862f6ac3c Jan 10 16:41:21 crc kubenswrapper[5036]: E0110 16:41:21.853037 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 10 16:41:21 crc kubenswrapper[5036]: E0110 16:41:21.853220 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s7n6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-gcjhz_openstack-operators(4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:41:21 crc kubenswrapper[5036]: E0110 16:41:21.854721 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podUID="4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.419015 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn"] Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.447140 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" event={"ID":"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7","Type":"ContainerStarted","Data":"4e246a53af074d002a4f957ee1d87c1f28deb3605b2c754016028c2862f6ac3c"} Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.463616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" event={"ID":"58ba757d-493c-4a4c-9aaa-a3178272b7cb","Type":"ContainerStarted","Data":"c7525943e23a06baa40eb6f20361ac873f3901d5e01670c9a209068273cd84ff"} Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.464456 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.477531 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" event={"ID":"ecf84720-507a-4a26-8326-7ed56754871e","Type":"ContainerStarted","Data":"4eebeee91a94da3e5c7af9a85f6f3068739021311b828df5a006000b5d3656e7"} Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.477660 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.488852 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" event={"ID":"2e9ebb80-028a-43ac-b9cb-379dd1eda24e","Type":"ContainerStarted","Data":"1de2b7d4311c0634071ab0f9926c253bdeb23ee9ef77e7366a60ec140f92fb9b"} Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.489482 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.493720 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" podStartSLOduration=17.50370336 podStartE2EDuration="39.493706021s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.016875856 +0000 UTC m=+766.887111350" lastFinishedPulling="2026-01-10 16:41:07.006878517 +0000 UTC m=+788.877114011" observedRunningTime="2026-01-10 16:41:22.493047662 +0000 UTC m=+804.363283156" watchObservedRunningTime="2026-01-10 16:41:22.493706021 +0000 UTC m=+804.363941515" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.528939 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" podStartSLOduration=17.486361836 podStartE2EDuration="39.528919659s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:44.963569382 +0000 UTC m=+766.833804876" lastFinishedPulling="2026-01-10 
16:41:07.006127205 +0000 UTC m=+788.876362699" observedRunningTime="2026-01-10 16:41:22.51043351 +0000 UTC m=+804.380669004" watchObservedRunningTime="2026-01-10 16:41:22.528919659 +0000 UTC m=+804.399155163" Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.532159 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh"] Jan 10 16:41:22 crc kubenswrapper[5036]: W0110 16:41:22.534987 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c6aeaf_94ec_4558_8ec7_b4fd144a49b1.slice/crio-a14a91c389b62f3005ec383b217839bc887587db6f7a8809ff7e19f6f13575fc WatchSource:0}: Error finding container a14a91c389b62f3005ec383b217839bc887587db6f7a8809ff7e19f6f13575fc: Status 404 returned error can't find the container with id a14a91c389b62f3005ec383b217839bc887587db6f7a8809ff7e19f6f13575fc Jan 10 16:41:22 crc kubenswrapper[5036]: I0110 16:41:22.539364 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" podStartSLOduration=17.365717453 podStartE2EDuration="39.539348278s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.427660096 +0000 UTC m=+767.297895590" lastFinishedPulling="2026-01-10 16:41:07.601290921 +0000 UTC m=+789.471526415" observedRunningTime="2026-01-10 16:41:22.530111843 +0000 UTC m=+804.400347337" watchObservedRunningTime="2026-01-10 16:41:22.539348278 +0000 UTC m=+804.409583782" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.511121 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" event={"ID":"cf6aa765-9fbf-429d-83c1-db4671e7600c","Type":"ContainerStarted","Data":"3ed5f4a6da5147b44712e99cff80a19c0800d53a76ddebcda0d1e7e3ef6711d0"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.511378 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.518285 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" event={"ID":"6283e4f6-c60e-4bff-b622-181c4abbc8a6","Type":"ContainerStarted","Data":"3748970b458cccd03eeed1717aeb70038f2170df9031a8497d2ff4df61f912fb"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.518534 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.521930 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" event={"ID":"6414be0b-ef34-4c95-9e31-4124dcad6cc4","Type":"ContainerStarted","Data":"4e87776865fbb163dae7e23f24c302d45e34d9083ca559076979909e9afe7eca"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.522561 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.543858 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" 
event={"ID":"0a3b9993-b2fb-4dda-952a-413cd5a3e01a","Type":"ContainerStarted","Data":"de66877d8ceaae5ee0a7fbf185a90498a6d3277d77cc2e5435641a53cdfc57ee"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.544455 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.550606 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" podStartSLOduration=17.29073461 podStartE2EDuration="40.55059128s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.449409804 +0000 UTC m=+767.319645298" lastFinishedPulling="2026-01-10 16:41:08.709266474 +0000 UTC m=+790.579501968" observedRunningTime="2026-01-10 16:41:23.545255997 +0000 UTC m=+805.415491491" watchObservedRunningTime="2026-01-10 16:41:23.55059128 +0000 UTC m=+805.420826764" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.564912 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" event={"ID":"254c9f2b-ef77-4fbc-9884-c14caa297876","Type":"ContainerStarted","Data":"5ac0f16fef722b004189213aae13c2b2bc42a1b4ce8a4412675d64b1ec8ac0ce"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.565556 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.578775 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" podStartSLOduration=19.035849471 podStartE2EDuration="40.578743996s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.057776968 +0000 UTC m=+767.928012462" lastFinishedPulling="2026-01-10 16:41:07.600671493 +0000 UTC m=+789.470906987" observedRunningTime="2026-01-10 16:41:23.574302799 +0000 UTC m=+805.444538293" watchObservedRunningTime="2026-01-10 16:41:23.578743996 +0000 UTC m=+805.448979490" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.583148 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" event={"ID":"a17f3d4e-41a9-4941-83f6-090808b6cb29","Type":"ContainerStarted","Data":"b9180629426db5b37b8d2fa566c94d505e199c7b8cc0fe5703760e111e606d63"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.583748 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.596948 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" event={"ID":"52b19fea-05ac-4448-9446-33fbee11b2da","Type":"ContainerStarted","Data":"3074f81d0457638b74b859293b7b59a6419b911bec049054ef9f358e719a0266"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.596988 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.623192 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" 
event={"ID":"506aa4ca-31bb-48da-94b5-9ab7b43aea96","Type":"ContainerStarted","Data":"d8cb33f17de9fa6fa39b84cdaddcd0396ba398c31bdf55360d604808b4fb357d"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.624877 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.653693 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" event={"ID":"2c21d679-225e-4c33-8920-06a85ae163b6","Type":"ContainerStarted","Data":"bfc0347c51d2a8df6fbc7259e6c34214e30dea40ec6fb56e05af3b90c8364954"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.654386 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.674195 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" event={"ID":"7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42","Type":"ContainerStarted","Data":"31f8c882e6c701a7ac638a46b446e818a918cdfec47733d4320e39f8cd35d943"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.674881 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.676308 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" event={"ID":"f1b7f315-826c-4a66-9919-69b3c75a648e","Type":"ContainerStarted","Data":"84a5ef38be8f4a6bd930c9cc9686562f7089d098ae492de54f719e337a4c90b9"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.676808 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.677468 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" event={"ID":"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1","Type":"ContainerStarted","Data":"a14a91c389b62f3005ec383b217839bc887587db6f7a8809ff7e19f6f13575fc"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.678393 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" event={"ID":"de8e8f66-6d85-43d5-94a4-613fb3bfc53b","Type":"ContainerStarted","Data":"9ec5f454b4f0720e0b231730f205166470efc46dd9b2332f1983697ac8841357"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.678417 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" event={"ID":"de8e8f66-6d85-43d5-94a4-613fb3bfc53b","Type":"ContainerStarted","Data":"bcec20af2833866628df6698e63ff7157e792dca359b1a7341ef13ef0e4857c8"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.678752 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.688992 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" 
event={"ID":"09239a1e-ce39-49e7-a532-f7c353022176","Type":"ContainerStarted","Data":"cf7557d48bc7c26dd623bf6bb93c56baedf1f0c2998adc7a16204f85d13b4cab"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.689608 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.690724 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" event={"ID":"f3046ad8-aadd-4883-82b9-a794ddce82b9","Type":"ContainerStarted","Data":"bdb308218510502e8838d2f49f455d5741319d6e6ad77cdb5ae2859bb0c9316a"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.690849 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.702014 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" event={"ID":"611b3f4f-0b6d-4ef9-b040-eba991c4bfe4","Type":"ContainerStarted","Data":"46bfa191581106e54c938c077a06285a500a6acb8cb96248fbfa8e985486bcb3"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.702634 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.723908 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" event={"ID":"552c1d94-e289-46e0-8756-58982a7cdc4c","Type":"ContainerStarted","Data":"6fca1afbf0b3f26694d2de09b5137370a1bf667ff644c5fc6cd078281b7fc851"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.724525 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.743038 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" event={"ID":"b4905be6-774a-4952-b195-f755688c7b26","Type":"ContainerStarted","Data":"ff98cd60e3e4f0f5f36bc5620a1d5e180ced638cff2c36975eeaee86508e4e0c"} Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.743087 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.785639 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" podStartSLOduration=4.703394879 podStartE2EDuration="40.785621321s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.052985281 +0000 UTC m=+767.923220765" lastFinishedPulling="2026-01-10 16:41:22.135211713 +0000 UTC m=+804.005447207" observedRunningTime="2026-01-10 16:41:23.767602085 +0000 UTC m=+805.637837569" watchObservedRunningTime="2026-01-10 16:41:23.785621321 +0000 UTC m=+805.655856815" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.808342 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" podStartSLOduration=5.011294587 podStartE2EDuration="40.808325102s" 
podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.062177963 +0000 UTC m=+767.932413457" lastFinishedPulling="2026-01-10 16:41:21.859208468 +0000 UTC m=+803.729443972" observedRunningTime="2026-01-10 16:41:23.700898285 +0000 UTC m=+805.571133789" watchObservedRunningTime="2026-01-10 16:41:23.808325102 +0000 UTC m=+805.678560596" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.831614 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" podStartSLOduration=18.273766729 podStartE2EDuration="40.831595918s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.040657762 +0000 UTC m=+766.910893256" lastFinishedPulling="2026-01-10 16:41:07.598486941 +0000 UTC m=+789.468722445" observedRunningTime="2026-01-10 16:41:23.829341574 +0000 UTC m=+805.699577068" watchObservedRunningTime="2026-01-10 16:41:23.831595918 +0000 UTC m=+805.701831412" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.916390 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" podStartSLOduration=3.4445491 podStartE2EDuration="40.916371066s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:44.602196175 +0000 UTC m=+766.472431669" lastFinishedPulling="2026-01-10 16:41:22.074018141 +0000 UTC m=+803.944253635" observedRunningTime="2026-01-10 16:41:23.902020205 +0000 UTC m=+805.772255699" watchObservedRunningTime="2026-01-10 16:41:23.916371066 +0000 UTC m=+805.786606550" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.916537 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" podStartSLOduration=18.367730458 podStartE2EDuration="40.916531351s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.051902251 +0000 UTC m=+766.922137745" lastFinishedPulling="2026-01-10 16:41:07.600703144 +0000 UTC m=+789.470938638" observedRunningTime="2026-01-10 16:41:23.866981862 +0000 UTC m=+805.737217356" watchObservedRunningTime="2026-01-10 16:41:23.916531351 +0000 UTC m=+805.786766845" Jan 10 16:41:23 crc kubenswrapper[5036]: I0110 16:41:23.948873 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" podStartSLOduration=3.774142515 podStartE2EDuration="40.948854046s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:44.970989553 +0000 UTC m=+766.841225047" lastFinishedPulling="2026-01-10 16:41:22.145701084 +0000 UTC m=+804.015936578" observedRunningTime="2026-01-10 16:41:23.944890403 +0000 UTC m=+805.815125897" watchObservedRunningTime="2026-01-10 16:41:23.948854046 +0000 UTC m=+805.819089540" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.058223 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" podStartSLOduration=41.058205448 podStartE2EDuration="41.058205448s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:41:24.025970455 +0000 UTC m=+805.896205959" 
watchObservedRunningTime="2026-01-10 16:41:24.058205448 +0000 UTC m=+805.928440942" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.099338 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" podStartSLOduration=4.240012985 podStartE2EDuration="41.099319196s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.459467219 +0000 UTC m=+767.329702713" lastFinishedPulling="2026-01-10 16:41:22.31877344 +0000 UTC m=+804.189008924" observedRunningTime="2026-01-10 16:41:24.063276463 +0000 UTC m=+805.933511957" watchObservedRunningTime="2026-01-10 16:41:24.099319196 +0000 UTC m=+805.969554690" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.103582 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" podStartSLOduration=3.998103047 podStartE2EDuration="41.103572058s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.02966782 +0000 UTC m=+766.899903314" lastFinishedPulling="2026-01-10 16:41:22.135136831 +0000 UTC m=+804.005372325" observedRunningTime="2026-01-10 16:41:24.097140123 +0000 UTC m=+805.967375617" watchObservedRunningTime="2026-01-10 16:41:24.103572058 +0000 UTC m=+805.973807552" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.152008 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" podStartSLOduration=3.724037911 podStartE2EDuration="41.151990594s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:44.576847095 +0000 UTC m=+766.447082589" lastFinishedPulling="2026-01-10 16:41:22.004799768 +0000 UTC m=+803.875035272" observedRunningTime="2026-01-10 16:41:24.118952508 +0000 UTC m=+805.989187992" watchObservedRunningTime="2026-01-10 16:41:24.151990594 +0000 UTC m=+806.022226088" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.152947 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" podStartSLOduration=17.137340852 podStartE2EDuration="41.152940462s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:44.693668474 +0000 UTC m=+766.563903968" lastFinishedPulling="2026-01-10 16:41:08.709268084 +0000 UTC m=+790.579503578" observedRunningTime="2026-01-10 16:41:24.149097631 +0000 UTC m=+806.019333125" watchObservedRunningTime="2026-01-10 16:41:24.152940462 +0000 UTC m=+806.023175956" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.175298 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" podStartSLOduration=18.989192032 podStartE2EDuration="41.175282411s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.412217657 +0000 UTC m=+767.282453151" lastFinishedPulling="2026-01-10 16:41:07.598308036 +0000 UTC m=+789.468543530" observedRunningTime="2026-01-10 16:41:24.174472168 +0000 UTC m=+806.044707692" watchObservedRunningTime="2026-01-10 16:41:24.175282411 +0000 UTC m=+806.045517905" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.196751 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" podStartSLOduration=19.645579904999998 podStartE2EDuration="41.196733946s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.049561314 +0000 UTC m=+767.919796808" lastFinishedPulling="2026-01-10 16:41:07.600715355 +0000 UTC m=+789.470950849" observedRunningTime="2026-01-10 16:41:24.19550233 +0000 UTC m=+806.065737824" watchObservedRunningTime="2026-01-10 16:41:24.196733946 +0000 UTC m=+806.066969440" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.215717 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" podStartSLOduration=18.672590218 podStartE2EDuration="41.215701669s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:45.057562542 +0000 UTC m=+766.927798036" lastFinishedPulling="2026-01-10 16:41:07.600673993 +0000 UTC m=+789.470909487" observedRunningTime="2026-01-10 16:41:24.215092952 +0000 UTC m=+806.085328446" watchObservedRunningTime="2026-01-10 16:41:24.215701669 +0000 UTC m=+806.085937163" Jan 10 16:41:24 crc kubenswrapper[5036]: I0110 16:41:24.251020 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" podStartSLOduration=5.488116025 podStartE2EDuration="41.25100339s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.094781359 +0000 UTC m=+767.965016853" lastFinishedPulling="2026-01-10 16:41:21.857668704 +0000 UTC m=+803.727904218" observedRunningTime="2026-01-10 16:41:24.245369129 +0000 UTC m=+806.115604623" watchObservedRunningTime="2026-01-10 16:41:24.25100339 +0000 UTC m=+806.121238884" Jan 10 16:41:25 crc kubenswrapper[5036]: I0110 16:41:25.904670 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:41:25 crc kubenswrapper[5036]: I0110 16:41:25.905099 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 16:41:26.767781 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" event={"ID":"80ddf12b-ee61-4d6f-a3fb-ff9aded793d7","Type":"ContainerStarted","Data":"3684df063692a7e42ec9b8b6785f3116ff8113b639bfaeecf61be41acb6b6dd2"} Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 16:41:26.768161 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 16:41:26.769731 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" event={"ID":"f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1","Type":"ContainerStarted","Data":"957ff4eb3bdf4574fa0e26cbbaa2eeca3982a22627129fc913bb4403bd3a186d"} Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 
16:41:26.769982 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 16:41:26.797047 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" podStartSLOduration=39.353057423 podStartE2EDuration="43.797021398s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:41:21.859197708 +0000 UTC m=+803.729433212" lastFinishedPulling="2026-01-10 16:41:26.303161703 +0000 UTC m=+808.173397187" observedRunningTime="2026-01-10 16:41:26.794438034 +0000 UTC m=+808.664673538" watchObservedRunningTime="2026-01-10 16:41:26.797021398 +0000 UTC m=+808.667256912" Jan 10 16:41:26 crc kubenswrapper[5036]: I0110 16:41:26.861438 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" podStartSLOduration=40.095359202 podStartE2EDuration="43.861402652s" podCreationTimestamp="2026-01-10 16:40:43 +0000 UTC" firstStartedPulling="2026-01-10 16:41:22.540018367 +0000 UTC m=+804.410253861" lastFinishedPulling="2026-01-10 16:41:26.306061827 +0000 UTC m=+808.176297311" observedRunningTime="2026-01-10 16:41:26.841972585 +0000 UTC m=+808.712208089" watchObservedRunningTime="2026-01-10 16:41:26.861402652 +0000 UTC m=+808.731638186" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.680337 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.682866 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.698801 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.770909 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.771038 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.771061 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xp6n\" (UniqueName: \"kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.871903 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.871998 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.872020 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xp6n\" (UniqueName: \"kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.872815 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.872837 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:32 crc kubenswrapper[5036]: I0110 16:41:32.902523 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4xp6n\" (UniqueName: \"kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n\") pod \"certified-operators-fdxr5\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.007177 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.491573 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:33 crc kubenswrapper[5036]: W0110 16:41:33.501498 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf74a5a6c_2ffd_433c_865b_2c81a65485a6.slice/crio-bfe73b44cfbf352c24fc02cbd6a0a471fd97f5ae60b4b40cda0e9579788655c0 WatchSource:0}: Error finding container bfe73b44cfbf352c24fc02cbd6a0a471fd97f5ae60b4b40cda0e9579788655c0: Status 404 returned error can't find the container with id bfe73b44cfbf352c24fc02cbd6a0a471fd97f5ae60b4b40cda0e9579788655c0 Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.639493 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-678b8c6d96-568pc" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.696004 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-78979fc445-2qq47" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.720191 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-trzdf" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.750245 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5967c8645c-cbdjv" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.843179 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-65c54c675d-ng9ld" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.851001 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7998b4cc7b-bjxnm" Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.996475 5036 generic.go:334] "Generic (PLEG): container finished" podID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerID="08477e3d4fa6cf2008d74bbc904e421f0d0d9ff21668d28de2c111d25c61182d" exitCode=0 Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.996512 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerDied","Data":"08477e3d4fa6cf2008d74bbc904e421f0d0d9ff21668d28de2c111d25c61182d"} Jan 10 16:41:33 crc kubenswrapper[5036]: I0110 16:41:33.996537 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerStarted","Data":"bfe73b44cfbf352c24fc02cbd6a0a471fd97f5ae60b4b40cda0e9579788655c0"} Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.063728 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-568985c78-cs2b2" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.124844 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5b47c74dd5-skh8x" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.181461 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-wv445" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.194381 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-598945d5b8-l8ggv" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.306722 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bb586bbf4-tn7cg" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.342488 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-746ccdd857-kkjhp" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.353712 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-68d988df55-88zlb" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.374506 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-nbdkb" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.379051 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-9dbdf6486-fwmft" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.413999 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-t7qtf" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.453032 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-4t295" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.473798 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-zz7v2" Jan 10 16:41:34 crc kubenswrapper[5036]: I0110 16:41:34.526358 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-84587ffc8-l7b7s" Jan 10 16:41:35 crc kubenswrapper[5036]: I0110 16:41:35.005295 5036 generic.go:334] "Generic (PLEG): container finished" podID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerID="6cafc7c9b0ec3f67640c757334dcfa5b93c1710bc58fc4fdc59e26831c96cf74" exitCode=0 Jan 10 16:41:35 crc kubenswrapper[5036]: I0110 16:41:35.005350 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerDied","Data":"6cafc7c9b0ec3f67640c757334dcfa5b93c1710bc58fc4fdc59e26831c96cf74"} Jan 10 16:41:35 crc kubenswrapper[5036]: E0110 16:41:35.509595 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podUID="4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd" Jan 10 16:41:36 crc kubenswrapper[5036]: I0110 16:41:36.016099 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerStarted","Data":"ca34a96b230c664822418d69f8e5532720cb55a9999203547395e990d9b21394"} Jan 10 16:41:36 crc kubenswrapper[5036]: I0110 16:41:36.035862 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fdxr5" podStartSLOduration=2.546294887 podStartE2EDuration="4.035835318s" podCreationTimestamp="2026-01-10 16:41:32 +0000 UTC" firstStartedPulling="2026-01-10 16:41:33.997984273 +0000 UTC m=+815.868219767" lastFinishedPulling="2026-01-10 16:41:35.487524704 +0000 UTC m=+817.357760198" observedRunningTime="2026-01-10 16:41:36.033226183 +0000 UTC m=+817.903461687" watchObservedRunningTime="2026-01-10 16:41:36.035835318 +0000 UTC m=+817.906070842" Jan 10 16:41:36 crc kubenswrapper[5036]: I0110 16:41:36.106837 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5b4889549f2j7sh" Jan 10 16:41:36 crc kubenswrapper[5036]: I0110 16:41:36.520175 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-56458c9ddd-p4bsn" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.655939 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xcmds" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.758783 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.760430 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.763174 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp47r\" (UniqueName: \"kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.763260 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.763343 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.773438 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.864305 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.864380 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp47r\" (UniqueName: \"kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.864433 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.864891 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.865004 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:39 crc kubenswrapper[5036]: I0110 16:41:39.885624 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qp47r\" (UniqueName: \"kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r\") pod \"redhat-marketplace-wz8nd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:40 crc kubenswrapper[5036]: I0110 16:41:40.076488 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:40 crc kubenswrapper[5036]: I0110 16:41:40.506841 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:41:41 crc kubenswrapper[5036]: I0110 16:41:41.049117 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerStarted","Data":"52591e78ff6ddb4ab76d131da33eaef86a70c317f915671285892af1164b6145"} Jan 10 16:41:43 crc kubenswrapper[5036]: I0110 16:41:43.007775 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:43 crc kubenswrapper[5036]: I0110 16:41:43.007847 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:43 crc kubenswrapper[5036]: I0110 16:41:43.070114 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:43 crc kubenswrapper[5036]: I0110 16:41:43.116148 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:44 crc kubenswrapper[5036]: I0110 16:41:44.452533 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:45 crc kubenswrapper[5036]: I0110 16:41:45.073792 5036 generic.go:334] "Generic (PLEG): container finished" podID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerID="b34fdda6a0e3d077205f7ae618f09169ddd1838ecd4abe7da4b7664d59e618cb" exitCode=0 Jan 10 16:41:45 crc kubenswrapper[5036]: I0110 16:41:45.073854 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerDied","Data":"b34fdda6a0e3d077205f7ae618f09169ddd1838ecd4abe7da4b7664d59e618cb"} Jan 10 16:41:45 crc kubenswrapper[5036]: I0110 16:41:45.073993 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fdxr5" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="registry-server" containerID="cri-o://ca34a96b230c664822418d69f8e5532720cb55a9999203547395e990d9b21394" gracePeriod=2 Jan 10 16:41:45 crc kubenswrapper[5036]: I0110 16:41:45.075429 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.264150 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.272331 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.281718 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.453270 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.453583 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.453712 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gslx\" (UniqueName: \"kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.554884 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.554946 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gslx\" (UniqueName: \"kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.555025 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.555538 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.555570 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.583905 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2gslx\" (UniqueName: \"kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx\") pod \"redhat-operators-pbgv7\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:46 crc kubenswrapper[5036]: I0110 16:41:46.604351 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.091649 5036 generic.go:334] "Generic (PLEG): container finished" podID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerID="ca34a96b230c664822418d69f8e5532720cb55a9999203547395e990d9b21394" exitCode=0 Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.091815 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerDied","Data":"ca34a96b230c664822418d69f8e5532720cb55a9999203547395e990d9b21394"} Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.120011 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.385200 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.579275 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities\") pod \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.579653 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content\") pod \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.579782 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xp6n\" (UniqueName: \"kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n\") pod \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\" (UID: \"f74a5a6c-2ffd-433c-865b-2c81a65485a6\") " Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.580289 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities" (OuterVolumeSpecName: "utilities") pod "f74a5a6c-2ffd-433c-865b-2c81a65485a6" (UID: "f74a5a6c-2ffd-433c-865b-2c81a65485a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.584883 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n" (OuterVolumeSpecName: "kube-api-access-4xp6n") pod "f74a5a6c-2ffd-433c-865b-2c81a65485a6" (UID: "f74a5a6c-2ffd-433c-865b-2c81a65485a6"). InnerVolumeSpecName "kube-api-access-4xp6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.628767 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f74a5a6c-2ffd-433c-865b-2c81a65485a6" (UID: "f74a5a6c-2ffd-433c-865b-2c81a65485a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.681402 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.681436 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f74a5a6c-2ffd-433c-865b-2c81a65485a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:41:47 crc kubenswrapper[5036]: I0110 16:41:47.681449 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xp6n\" (UniqueName: \"kubernetes.io/projected/f74a5a6c-2ffd-433c-865b-2c81a65485a6-kube-api-access-4xp6n\") on node \"crc\" DevicePath \"\"" Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.106842 5036 generic.go:334] "Generic (PLEG): container finished" podID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerID="8aef4ab7455f8de0daed903127095a521547ead70c076a586ffa64eb34136e4e" exitCode=0 Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.106905 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerDied","Data":"8aef4ab7455f8de0daed903127095a521547ead70c076a586ffa64eb34136e4e"} Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.111459 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdxr5" event={"ID":"f74a5a6c-2ffd-433c-865b-2c81a65485a6","Type":"ContainerDied","Data":"bfe73b44cfbf352c24fc02cbd6a0a471fd97f5ae60b4b40cda0e9579788655c0"} Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.111496 5036 scope.go:117] "RemoveContainer" containerID="ca34a96b230c664822418d69f8e5532720cb55a9999203547395e990d9b21394" Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.111616 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdxr5" Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.113941 5036 generic.go:334] "Generic (PLEG): container finished" podID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerID="fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e" exitCode=0 Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.114048 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerDied","Data":"fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e"} Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.114307 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerStarted","Data":"fa398bdadccbbe9ed29a9ff5b6c85a777e8cbe9d810e4bb33e35137f92a7a12d"} Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.135559 5036 scope.go:117] "RemoveContainer" containerID="6cafc7c9b0ec3f67640c757334dcfa5b93c1710bc58fc4fdc59e26831c96cf74" Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.148853 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.155115 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fdxr5"] Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.165112 5036 scope.go:117] "RemoveContainer" containerID="08477e3d4fa6cf2008d74bbc904e421f0d0d9ff21668d28de2c111d25c61182d" Jan 10 16:41:48 crc kubenswrapper[5036]: I0110 16:41:48.517443 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" path="/var/lib/kubelet/pods/f74a5a6c-2ffd-433c-865b-2c81a65485a6/volumes" Jan 10 16:41:49 crc kubenswrapper[5036]: I0110 16:41:49.121949 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerStarted","Data":"711eedc99aba949697e7220482e452167782186d94ae22acb2d0133f45c1964f"} Jan 10 16:41:49 crc kubenswrapper[5036]: I0110 16:41:49.124208 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerStarted","Data":"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69"} Jan 10 16:41:49 crc kubenswrapper[5036]: I0110 16:41:49.139659 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wz8nd" podStartSLOduration=6.522425196 podStartE2EDuration="10.139639623s" podCreationTimestamp="2026-01-10 16:41:39 +0000 UTC" firstStartedPulling="2026-01-10 16:41:45.075183067 +0000 UTC m=+826.945418561" lastFinishedPulling="2026-01-10 16:41:48.692397494 +0000 UTC m=+830.562632988" observedRunningTime="2026-01-10 16:41:49.137880613 +0000 UTC m=+831.008116107" watchObservedRunningTime="2026-01-10 16:41:49.139639623 +0000 UTC m=+831.009875117" Jan 10 16:41:50 crc kubenswrapper[5036]: I0110 16:41:50.076917 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:50 crc kubenswrapper[5036]: I0110 16:41:50.076977 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:41:50 crc kubenswrapper[5036]: I0110 16:41:50.132058 5036 generic.go:334] "Generic (PLEG): container finished" podID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerID="c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69" exitCode=0 Jan 10 16:41:50 crc kubenswrapper[5036]: I0110 16:41:50.132104 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerDied","Data":"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69"} Jan 10 16:41:51 crc kubenswrapper[5036]: I0110 16:41:51.118879 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-wz8nd" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="registry-server" probeResult="failure" output=< Jan 10 16:41:51 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:41:51 crc kubenswrapper[5036]: > Jan 10 16:41:51 crc kubenswrapper[5036]: I0110 16:41:51.141416 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerStarted","Data":"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058"} Jan 10 16:41:51 crc kubenswrapper[5036]: I0110 16:41:51.143088 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" event={"ID":"4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd","Type":"ContainerStarted","Data":"0adb0d9a3b4e59395ba834bb5001cbe1d9a35d4ec47f1d0acb0bd7172756cf72"} Jan 10 16:41:51 crc kubenswrapper[5036]: I0110 16:41:51.177305 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbgv7" podStartSLOduration=2.66641914 podStartE2EDuration="5.17727316s" podCreationTimestamp="2026-01-10 16:41:46 +0000 UTC" firstStartedPulling="2026-01-10 16:41:48.115023698 +0000 UTC m=+829.985259212" lastFinishedPulling="2026-01-10 16:41:50.625877738 +0000 UTC m=+832.496113232" observedRunningTime="2026-01-10 16:41:51.164507425 +0000 UTC m=+833.034742959" watchObservedRunningTime="2026-01-10 16:41:51.17727316 +0000 UTC m=+833.047508674" Jan 10 16:41:51 crc kubenswrapper[5036]: I0110 16:41:51.202561 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gcjhz" podStartSLOduration=3.27653251 podStartE2EDuration="1m7.202541124s" podCreationTimestamp="2026-01-10 16:40:44 +0000 UTC" firstStartedPulling="2026-01-10 16:40:46.112329927 +0000 UTC m=+767.982565421" lastFinishedPulling="2026-01-10 16:41:50.038338541 +0000 UTC m=+831.908574035" observedRunningTime="2026-01-10 16:41:51.19924134 +0000 UTC m=+833.069476834" watchObservedRunningTime="2026-01-10 16:41:51.202541124 +0000 UTC m=+833.072776618" Jan 10 16:41:55 crc kubenswrapper[5036]: I0110 16:41:55.904724 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:41:55 crc kubenswrapper[5036]: I0110 16:41:55.904783 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:41:55 crc kubenswrapper[5036]: I0110 16:41:55.904830 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:41:55 crc kubenswrapper[5036]: I0110 16:41:55.905507 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:41:55 crc kubenswrapper[5036]: I0110 16:41:55.905551 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78" gracePeriod=600 Jan 10 16:41:56 crc kubenswrapper[5036]: I0110 16:41:56.604861 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:56 crc kubenswrapper[5036]: I0110 16:41:56.605159 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:56 crc kubenswrapper[5036]: I0110 16:41:56.649385 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:57 crc kubenswrapper[5036]: I0110 16:41:57.236959 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:57 crc kubenswrapper[5036]: I0110 16:41:57.295176 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:41:58 crc kubenswrapper[5036]: I0110 16:41:58.199144 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78" exitCode=0 Jan 10 16:41:58 crc kubenswrapper[5036]: I0110 16:41:58.199221 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78"} Jan 10 16:41:58 crc kubenswrapper[5036]: I0110 16:41:58.199499 5036 scope.go:117] "RemoveContainer" containerID="20378322ebd3e7842d8359f595988c5cc568fc1291f3096caa536a9fbcf9d4b2" Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.206968 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136"} Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.207302 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbgv7" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="registry-server" 
containerID="cri-o://61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058" gracePeriod=2 Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.657426 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.856832 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content\") pod \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.856940 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities\") pod \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.857016 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gslx\" (UniqueName: \"kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx\") pod \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\" (UID: \"68b761b4-18eb-49f6-b9ca-7c9865ab6aba\") " Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.858515 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities" (OuterVolumeSpecName: "utilities") pod "68b761b4-18eb-49f6-b9ca-7c9865ab6aba" (UID: "68b761b4-18eb-49f6-b9ca-7c9865ab6aba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.865952 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx" (OuterVolumeSpecName: "kube-api-access-2gslx") pod "68b761b4-18eb-49f6-b9ca-7c9865ab6aba" (UID: "68b761b4-18eb-49f6-b9ca-7c9865ab6aba"). InnerVolumeSpecName "kube-api-access-2gslx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.958460 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:41:59 crc kubenswrapper[5036]: I0110 16:41:59.958500 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gslx\" (UniqueName: \"kubernetes.io/projected/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-kube-api-access-2gslx\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.140273 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.197443 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.219234 5036 generic.go:334] "Generic (PLEG): container finished" podID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerID="61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058" exitCode=0 Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.219284 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerDied","Data":"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058"} Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.220090 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbgv7" event={"ID":"68b761b4-18eb-49f6-b9ca-7c9865ab6aba","Type":"ContainerDied","Data":"fa398bdadccbbe9ed29a9ff5b6c85a777e8cbe9d810e4bb33e35137f92a7a12d"} Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.219340 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbgv7" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.220117 5036 scope.go:117] "RemoveContainer" containerID="61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.241503 5036 scope.go:117] "RemoveContainer" containerID="c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.281584 5036 scope.go:117] "RemoveContainer" containerID="fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.299497 5036 scope.go:117] "RemoveContainer" containerID="61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058" Jan 10 16:42:00 crc kubenswrapper[5036]: E0110 16:42:00.299917 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058\": container with ID starting with 61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058 not found: ID does not exist" containerID="61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.300023 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058"} err="failed to get container status \"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058\": rpc error: code = NotFound desc = could not find container \"61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058\": container with ID starting with 61dce9533c95a9ee2e8ab96346a3e96fced5ee75ded207fd4e7099a7eda8d058 not found: ID does not exist" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.300059 5036 scope.go:117] "RemoveContainer" containerID="c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69" Jan 10 16:42:00 crc kubenswrapper[5036]: E0110 16:42:00.300445 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69\": container with ID starting with c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69 not found: ID does not exist" containerID="c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.300486 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69"} err="failed to get container status \"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69\": rpc error: code = NotFound desc = could not find container \"c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69\": container with ID starting with c4c5f933dc7997144a246982ce7f6c1a0b4b84c4cb5ab1310fa49d301d4d1f69 not found: ID does not exist" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.300525 5036 scope.go:117] "RemoveContainer" containerID="fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e" Jan 10 16:42:00 crc kubenswrapper[5036]: E0110 16:42:00.300994 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e\": container with ID starting 
with fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e not found: ID does not exist" containerID="fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.301023 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e"} err="failed to get container status \"fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e\": rpc error: code = NotFound desc = could not find container \"fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e\": container with ID starting with fd35aba0352d28f32fbd073e418bd5390e3625e4f59f5452211be3378c02366e not found: ID does not exist" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.421809 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68b761b4-18eb-49f6-b9ca-7c9865ab6aba" (UID: "68b761b4-18eb-49f6-b9ca-7c9865ab6aba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.467906 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68b761b4-18eb-49f6-b9ca-7c9865ab6aba-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.545542 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:42:00 crc kubenswrapper[5036]: I0110 16:42:00.549205 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbgv7"] Jan 10 16:42:01 crc kubenswrapper[5036]: I0110 16:42:01.883082 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:42:01 crc kubenswrapper[5036]: I0110 16:42:01.883589 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wz8nd" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="registry-server" containerID="cri-o://711eedc99aba949697e7220482e452167782186d94ae22acb2d0133f45c1964f" gracePeriod=2 Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.236498 5036 generic.go:334] "Generic (PLEG): container finished" podID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerID="711eedc99aba949697e7220482e452167782186d94ae22acb2d0133f45c1964f" exitCode=0 Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.236531 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerDied","Data":"711eedc99aba949697e7220482e452167782186d94ae22acb2d0133f45c1964f"} Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.397338 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.497908 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities\") pod \"c6de1417-c36c-4cbc-863e-15dc71f85afd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.497962 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp47r\" (UniqueName: \"kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r\") pod \"c6de1417-c36c-4cbc-863e-15dc71f85afd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.497991 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content\") pod \"c6de1417-c36c-4cbc-863e-15dc71f85afd\" (UID: \"c6de1417-c36c-4cbc-863e-15dc71f85afd\") " Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.498999 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities" (OuterVolumeSpecName: "utilities") pod "c6de1417-c36c-4cbc-863e-15dc71f85afd" (UID: "c6de1417-c36c-4cbc-863e-15dc71f85afd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.506648 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r" (OuterVolumeSpecName: "kube-api-access-qp47r") pod "c6de1417-c36c-4cbc-863e-15dc71f85afd" (UID: "c6de1417-c36c-4cbc-863e-15dc71f85afd"). InnerVolumeSpecName "kube-api-access-qp47r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.518150 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" path="/var/lib/kubelet/pods/68b761b4-18eb-49f6-b9ca-7c9865ab6aba/volumes" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.522928 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6de1417-c36c-4cbc-863e-15dc71f85afd" (UID: "c6de1417-c36c-4cbc-863e-15dc71f85afd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.599839 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.599879 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp47r\" (UniqueName: \"kubernetes.io/projected/c6de1417-c36c-4cbc-863e-15dc71f85afd-kube-api-access-qp47r\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:02 crc kubenswrapper[5036]: I0110 16:42:02.599893 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6de1417-c36c-4cbc-863e-15dc71f85afd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.247279 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wz8nd" event={"ID":"c6de1417-c36c-4cbc-863e-15dc71f85afd","Type":"ContainerDied","Data":"52591e78ff6ddb4ab76d131da33eaef86a70c317f915671285892af1164b6145"} Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.247322 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wz8nd" Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.247617 5036 scope.go:117] "RemoveContainer" containerID="711eedc99aba949697e7220482e452167782186d94ae22acb2d0133f45c1964f" Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.267144 5036 scope.go:117] "RemoveContainer" containerID="8aef4ab7455f8de0daed903127095a521547ead70c076a586ffa64eb34136e4e" Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.290976 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.300343 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wz8nd"] Jan 10 16:42:03 crc kubenswrapper[5036]: I0110 16:42:03.306948 5036 scope.go:117] "RemoveContainer" containerID="b34fdda6a0e3d077205f7ae618f09169ddd1838ecd4abe7da4b7664d59e618cb" Jan 10 16:42:04 crc kubenswrapper[5036]: I0110 16:42:04.519085 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" path="/var/lib/kubelet/pods/c6de1417-c36c-4cbc-863e-15dc71f85afd/volumes" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.405704 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406511 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406526 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406542 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406550 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406567 5036 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406575 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406588 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406594 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406605 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406610 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406619 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406624 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="extract-content" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406631 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406636 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406643 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406649 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: E0110 16:42:08.406661 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406666 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="extract-utilities" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406809 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74a5a6c-2ffd-433c-865b-2c81a65485a6" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406823 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b761b4-18eb-49f6-b9ca-7c9865ab6aba" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.406834 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6de1417-c36c-4cbc-863e-15dc71f85afd" containerName="registry-server" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.408548 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.412753 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.412820 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.412865 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kbbcj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.416659 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.440198 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.489247 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjzg\" (UniqueName: \"kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.489376 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.565418 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.574136 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.577607 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.587129 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.591252 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h7n\" (UniqueName: \"kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.591404 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.591510 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.591590 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.591672 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjzg\" (UniqueName: \"kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.592852 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.648728 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjzg\" (UniqueName: \"kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg\") pod \"dnsmasq-dns-675f4bcbfc-pssbt\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.693396 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h7n\" (UniqueName: \"kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc 
kubenswrapper[5036]: I0110 16:42:08.693491 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.693585 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.694480 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.695384 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.714961 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h7n\" (UniqueName: \"kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n\") pod \"dnsmasq-dns-78dd6ddcc-49slj\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.741768 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:08 crc kubenswrapper[5036]: I0110 16:42:08.894712 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.154630 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.286528 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" event={"ID":"4c4841ee-0a5e-40b3-8485-6c58a0229301","Type":"ContainerStarted","Data":"3f36a01bc98e1cd313e01c4f2f69a67e8936b21968851cc23b32e8bb5e4ac6e0"} Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.322325 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:09 crc kubenswrapper[5036]: W0110 16:42:09.327488 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbdbb880_4817_4aa5_88a8_ba637b9ea219.slice/crio-095acce0d5ae9f5d845945fbf76ed367326e5013c21de334874b41a7fc385add WatchSource:0}: Error finding container 095acce0d5ae9f5d845945fbf76ed367326e5013c21de334874b41a7fc385add: Status 404 returned error can't find the container with id 095acce0d5ae9f5d845945fbf76ed367326e5013c21de334874b41a7fc385add Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.466998 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.469166 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.484305 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.508237 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.508273 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.508559 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs4bl\" (UniqueName: \"kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.609753 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs4bl\" (UniqueName: \"kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.610069 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.610225 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.610770 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.610781 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.631818 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs4bl\" (UniqueName: \"kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl\") pod \"community-operators-65vvl\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:09 crc kubenswrapper[5036]: I0110 16:42:09.794478 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:10 crc kubenswrapper[5036]: I0110 16:42:10.300828 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:10 crc kubenswrapper[5036]: I0110 16:42:10.301367 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" event={"ID":"cbdbb880-4817-4aa5-88a8-ba637b9ea219","Type":"ContainerStarted","Data":"095acce0d5ae9f5d845945fbf76ed367326e5013c21de334874b41a7fc385add"} Jan 10 16:42:10 crc kubenswrapper[5036]: W0110 16:42:10.309795 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b614213_d497_4882_9055_dc68dd058b01.slice/crio-4fcd53dfe451d35e1d5ef005c8ddda1c71d153be6e0ac67d5594304f68e8b9f2 WatchSource:0}: Error finding container 4fcd53dfe451d35e1d5ef005c8ddda1c71d153be6e0ac67d5594304f68e8b9f2: Status 404 returned error can't find the container with id 4fcd53dfe451d35e1d5ef005c8ddda1c71d153be6e0ac67d5594304f68e8b9f2 Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.191804 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.227203 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.228322 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.275832 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrlh\" (UniqueName: \"kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.276652 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.277084 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.279562 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.364618 5036 generic.go:334] "Generic (PLEG): container finished" podID="6b614213-d497-4882-9055-dc68dd058b01" containerID="9232ea53f4be74d0156a8f1a8db28ec9ed79e41a520588a7e3a111dbbac6b8bf" exitCode=0 Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.364713 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerDied","Data":"9232ea53f4be74d0156a8f1a8db28ec9ed79e41a520588a7e3a111dbbac6b8bf"} Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.364752 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerStarted","Data":"4fcd53dfe451d35e1d5ef005c8ddda1c71d153be6e0ac67d5594304f68e8b9f2"} Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.385275 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.385393 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.385466 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrlh\" (UniqueName: \"kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 
16:42:11.386343 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.386809 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.424919 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrlh\" (UniqueName: \"kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh\") pod \"dnsmasq-dns-666b6646f7-vccnd\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.575977 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.651015 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.683533 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.684583 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.767906 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.791841 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.791877 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlw4\" (UniqueName: \"kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.791935 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.894560 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc 
kubenswrapper[5036]: I0110 16:42:11.894688 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.894722 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlw4\" (UniqueName: \"kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.896397 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.897015 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:11 crc kubenswrapper[5036]: I0110 16:42:11.917915 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlw4\" (UniqueName: \"kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4\") pod \"dnsmasq-dns-57d769cc4f-vhkcd\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.078745 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.249136 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:12 crc kubenswrapper[5036]: W0110 16:42:12.273095 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b4b1dc6_7568_41aa_926b_014a274ef1d6.slice/crio-ea1ead3cf516e90ec3250232e4dc6a2ae2feb7fb5fc8fca1c0a31ced1f06ef5a WatchSource:0}: Error finding container ea1ead3cf516e90ec3250232e4dc6a2ae2feb7fb5fc8fca1c0a31ced1f06ef5a: Status 404 returned error can't find the container with id ea1ead3cf516e90ec3250232e4dc6a2ae2feb7fb5fc8fca1c0a31ced1f06ef5a Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.377989 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" event={"ID":"7b4b1dc6-7568-41aa-926b-014a274ef1d6","Type":"ContainerStarted","Data":"ea1ead3cf516e90ec3250232e4dc6a2ae2feb7fb5fc8fca1c0a31ced1f06ef5a"} Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.403488 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.405573 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.408564 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.408722 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.408855 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b7v4x" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.408860 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.408722 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.409039 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.409227 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.421650 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.542607 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:42:12 crc kubenswrapper[5036]: W0110 16:42:12.570726 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4a7f810_b3cd_4699_9ac4_b09e68779b5f.slice/crio-a46185c465a60e891364c9b4fdd46a4c78e127130157686236bedf735eb337af WatchSource:0}: Error finding container a46185c465a60e891364c9b4fdd46a4c78e127130157686236bedf735eb337af: Status 404 returned error can't find the container with id a46185c465a60e891364c9b4fdd46a4c78e127130157686236bedf735eb337af Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606610 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606658 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606775 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606836 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606855 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606921 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606958 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.606988 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.607008 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.607053 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.607120 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwrx\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709106 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709193 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwrx\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 
16:42:12.709225 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709260 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709286 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709326 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709348 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709366 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709391 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709412 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.709435 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.710313 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.710525 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.710629 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.710674 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.711739 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.712698 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.718526 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.718958 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.723433 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.729284 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.733809 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwrx\" (UniqueName: 
\"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.746752 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.769044 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.826326 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.827430 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830185 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830361 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830476 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830577 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830801 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pdt5c" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.830933 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.831875 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 10 16:42:12 crc kubenswrapper[5036]: I0110 16:42:12.844088 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.016638 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.016896 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.016967 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017044 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017139 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017238 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017301 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017371 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017433 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017506 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mllbt\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.017581 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119189 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119263 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119285 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119310 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119329 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119350 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mllbt\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119376 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119403 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119427 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119443 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119471 
5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.119982 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.120088 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.120453 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.120527 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.120528 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.123771 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.124519 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.129375 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.129479 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.129626 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.134733 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mllbt\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.150288 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.201869 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:42:13 crc kubenswrapper[5036]: I0110 16:42:13.388077 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerStarted","Data":"a46185c465a60e891364c9b4fdd46a4c78e127130157686236bedf735eb337af"} Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.008426 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.011415 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.015757 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m2n7z" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.015945 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.017355 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.017811 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.020825 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.028631 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149397 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149472 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149522 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149595 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149634 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvqsc\" (UniqueName: \"kubernetes.io/projected/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kube-api-access-qvqsc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149666 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149709 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.149746 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251386 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251444 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvqsc\" (UniqueName: \"kubernetes.io/projected/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kube-api-access-qvqsc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251467 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251487 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251514 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251560 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251594 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.251621 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.255762 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.257104 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.257174 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.257405 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.257936 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.264284 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.272869 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.274292 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.289225 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvqsc\" (UniqueName: \"kubernetes.io/projected/3f624572-bbfe-4c9d-be6f-f8f647fd8aa2-kube-api-access-qvqsc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.333627 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2\") " pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc 
kubenswrapper[5036]: I0110 16:42:14.345219 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.376166 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.407570 5036 generic.go:334] "Generic (PLEG): container finished" podID="6b614213-d497-4882-9055-dc68dd058b01" containerID="cc21e92489404e4396554aa1d6bb669c52cefa54c05200a2f1d07e692b23f220" exitCode=0 Jan 10 16:42:14 crc kubenswrapper[5036]: I0110 16:42:14.407609 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerDied","Data":"cc21e92489404e4396554aa1d6bb669c52cefa54c05200a2f1d07e692b23f220"} Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.468206 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.470370 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.473474 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-zsj4k" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.475315 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.475457 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.475666 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.482224 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.577555 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.577604 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.578528 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.578580 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.579966 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.580001 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.580067 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx4hn\" (UniqueName: \"kubernetes.io/projected/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kube-api-access-bx4hn\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.580127 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682222 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682314 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682345 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682373 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682439 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682468 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682497 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx4hn\" (UniqueName: \"kubernetes.io/projected/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kube-api-access-bx4hn\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.682543 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.683294 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.684484 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.684672 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.687049 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.687149 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.705627 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.708568 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx4hn\" (UniqueName: \"kubernetes.io/projected/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-kube-api-access-bx4hn\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.717805 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.720875 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2\") " pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.747806 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.748979 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.750986 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.751228 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gm9d9" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.751474 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.800267 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.811244 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.891033 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.891166 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szphl\" (UniqueName: \"kubernetes.io/projected/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kube-api-access-szphl\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.891196 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-config-data\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.891228 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.892890 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kolla-config\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.994960 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kolla-config\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995062 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995101 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szphl\" (UniqueName: \"kubernetes.io/projected/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kube-api-access-szphl\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995119 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-config-data\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995147 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995941 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kolla-config\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:15 crc kubenswrapper[5036]: I0110 16:42:15.995949 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/066ba36b-3da0-4db3-8f19-13e5a5227ab5-config-data\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:16 crc kubenswrapper[5036]: I0110 16:42:16.004018 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:16 crc kubenswrapper[5036]: I0110 16:42:16.004181 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/066ba36b-3da0-4db3-8f19-13e5a5227ab5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:16 crc kubenswrapper[5036]: I0110 16:42:16.019378 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szphl\" (UniqueName: \"kubernetes.io/projected/066ba36b-3da0-4db3-8f19-13e5a5227ab5-kube-api-access-szphl\") pod \"memcached-0\" (UID: \"066ba36b-3da0-4db3-8f19-13e5a5227ab5\") " pod="openstack/memcached-0" Jan 10 16:42:16 crc kubenswrapper[5036]: I0110 16:42:16.080246 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.546723 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.548341 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.553372 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-x4dxs" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.553819 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.718573 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xb8b\" (UniqueName: \"kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b\") pod \"kube-state-metrics-0\" (UID: \"66dcc1cf-f7f9-4064-b019-4ec5f205ea03\") " pod="openstack/kube-state-metrics-0" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.820216 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xb8b\" (UniqueName: \"kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b\") pod \"kube-state-metrics-0\" (UID: \"66dcc1cf-f7f9-4064-b019-4ec5f205ea03\") " pod="openstack/kube-state-metrics-0" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.857252 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xb8b\" (UniqueName: \"kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b\") pod \"kube-state-metrics-0\" (UID: \"66dcc1cf-f7f9-4064-b019-4ec5f205ea03\") " pod="openstack/kube-state-metrics-0" Jan 10 16:42:17 crc kubenswrapper[5036]: I0110 16:42:17.865527 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.931871 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-czqbw"] Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.933411 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.939907 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.940261 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rxg7d" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.940413 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.940436 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vsd6b"] Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.941981 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.947698 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw"] Jan 10 16:42:20 crc kubenswrapper[5036]: I0110 16:42:20.974607 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vsd6b"] Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.069992 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65d28afa-c448-4c8a-8fe9-062d9383f484-scripts\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070032 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-etc-ovs\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070078 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-run\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070097 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070119 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-lib\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070150 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070267 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-log\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070393 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjgm9\" (UniqueName: \"kubernetes.io/projected/65d28afa-c448-4c8a-8fe9-062d9383f484-kube-api-access-wjgm9\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070505 
5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z82d\" (UniqueName: \"kubernetes.io/projected/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-kube-api-access-4z82d\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070546 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-ovn-controller-tls-certs\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070606 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-combined-ca-bundle\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070632 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-log-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.070671 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-scripts\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171856 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z82d\" (UniqueName: \"kubernetes.io/projected/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-kube-api-access-4z82d\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171901 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-ovn-controller-tls-certs\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171928 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-combined-ca-bundle\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171945 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-log-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171965 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-scripts\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.171986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65d28afa-c448-4c8a-8fe9-062d9383f484-scripts\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172001 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-etc-ovs\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172041 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-run\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172067 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172091 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-lib\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172121 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172211 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-log\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172257 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjgm9\" (UniqueName: \"kubernetes.io/projected/65d28afa-c448-4c8a-8fe9-062d9383f484-kube-api-access-wjgm9\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172778 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-run\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " 
pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.172855 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.173038 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-run\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.173264 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-lib\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.173487 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-var-log-ovn\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.173498 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-var-log\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.173861 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/65d28afa-c448-4c8a-8fe9-062d9383f484-etc-ovs\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.175967 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65d28afa-c448-4c8a-8fe9-062d9383f484-scripts\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.176081 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-scripts\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.183868 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-combined-ca-bundle\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.183917 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-ovn-controller-tls-certs\") pod 
\"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.202503 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z82d\" (UniqueName: \"kubernetes.io/projected/be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4-kube-api-access-4z82d\") pod \"ovn-controller-czqbw\" (UID: \"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4\") " pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.203046 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjgm9\" (UniqueName: \"kubernetes.io/projected/65d28afa-c448-4c8a-8fe9-062d9383f484-kube-api-access-wjgm9\") pod \"ovn-controller-ovs-vsd6b\" (UID: \"65d28afa-c448-4c8a-8fe9-062d9383f484\") " pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.259139 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.266353 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.464732 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerStarted","Data":"ec35042f1f0a13b49eeaf11f6a93f876702ecf50ec43890d2ac0ebdd3dc0d7c9"} Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.469560 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerStarted","Data":"7d08dbf4df089f307d08c2f62ea15f05e60efd7b821c66f6c644139bf604a74c"} Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.837955 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.840118 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.842723 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.842890 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.844589 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.844711 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-lb78c" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.849004 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.850125 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896433 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-config\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896539 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896564 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896589 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896621 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896664 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896716 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.896740 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksm9\" (UniqueName: \"kubernetes.io/projected/b642befa-dd18-4984-b74f-d3945ee06f7d-kube-api-access-7ksm9\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.998765 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-config\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.998879 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.998902 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.998928 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.998964 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.999017 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.999045 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 16:42:21.999072 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksm9\" (UniqueName: \"kubernetes.io/projected/b642befa-dd18-4984-b74f-d3945ee06f7d-kube-api-access-7ksm9\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:21 crc kubenswrapper[5036]: I0110 
16:42:21.999801 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.000178 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-config\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.000650 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.001534 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b642befa-dd18-4984-b74f-d3945ee06f7d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.004717 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.010464 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.011508 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b642befa-dd18-4984-b74f-d3945ee06f7d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.018069 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksm9\" (UniqueName: \"kubernetes.io/projected/b642befa-dd18-4984-b74f-d3945ee06f7d-kube-api-access-7ksm9\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.022478 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"b642befa-dd18-4984-b74f-d3945ee06f7d\") " pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:22 crc kubenswrapper[5036]: I0110 16:42:22.202615 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:24 crc kubenswrapper[5036]: I0110 16:42:24.996339 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 10 16:42:24 crc kubenswrapper[5036]: I0110 16:42:24.997822 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.000033 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-46wvh" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.000443 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.001737 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.003309 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.009766 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.041045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.041362 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxpv\" (UniqueName: \"kubernetes.io/projected/4f74eaf1-cd39-41dc-8c0a-170373e863e5-kube-api-access-6cxpv\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.041564 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.041719 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.041913 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.042077 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " 
pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.043185 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.043272 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145617 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145729 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145805 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145833 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxpv\" (UniqueName: \"kubernetes.io/projected/4f74eaf1-cd39-41dc-8c0a-170373e863e5-kube-api-access-6cxpv\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145901 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145940 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145974 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.145995 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.146738 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.146961 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.147246 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-config\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.147491 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f74eaf1-cd39-41dc-8c0a-170373e863e5-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.153389 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.153897 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.157297 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f74eaf1-cd39-41dc-8c0a-170373e863e5-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.168000 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.168757 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxpv\" (UniqueName: \"kubernetes.io/projected/4f74eaf1-cd39-41dc-8c0a-170373e863e5-kube-api-access-6cxpv\") pod \"ovsdbserver-sb-0\" (UID: \"4f74eaf1-cd39-41dc-8c0a-170373e863e5\") " pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:25 crc kubenswrapper[5036]: I0110 16:42:25.322655 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:30 crc kubenswrapper[5036]: I0110 16:42:30.676515 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.160562 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.161337 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wxjzg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-pssbt_openstack(4c4841ee-0a5e-40b3-8485-6c58a0229301): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.162514 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" podUID="4c4841ee-0a5e-40b3-8485-6c58a0229301" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.173968 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.174145 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pzrlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-vccnd_openstack(7b4b1dc6-7568-41aa-926b-014a274ef1d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.175307 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" podUID="7b4b1dc6-7568-41aa-926b-014a274ef1d6" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.180918 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.181055 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k9h7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-49slj_openstack(cbdbb880-4817-4aa5-88a8-ba637b9ea219): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.182197 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" podUID="cbdbb880-4817-4aa5-88a8-ba637b9ea219" Jan 10 16:42:31 crc kubenswrapper[5036]: E0110 16:42:31.567573 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" podUID="7b4b1dc6-7568-41aa-926b-014a274ef1d6" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.341586 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.342577 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.447974 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxjzg\" (UniqueName: \"kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg\") pod \"4c4841ee-0a5e-40b3-8485-6c58a0229301\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.448037 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config\") pod \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.448144 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc\") pod \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.448165 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config\") pod \"4c4841ee-0a5e-40b3-8485-6c58a0229301\" (UID: \"4c4841ee-0a5e-40b3-8485-6c58a0229301\") " Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.448230 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9h7n\" (UniqueName: \"kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n\") pod \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\" (UID: \"cbdbb880-4817-4aa5-88a8-ba637b9ea219\") " Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.449561 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbdbb880-4817-4aa5-88a8-ba637b9ea219" (UID: "cbdbb880-4817-4aa5-88a8-ba637b9ea219"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.450001 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config" (OuterVolumeSpecName: "config") pod "cbdbb880-4817-4aa5-88a8-ba637b9ea219" (UID: "cbdbb880-4817-4aa5-88a8-ba637b9ea219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.450518 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config" (OuterVolumeSpecName: "config") pod "4c4841ee-0a5e-40b3-8485-6c58a0229301" (UID: "4c4841ee-0a5e-40b3-8485-6c58a0229301"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.454386 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg" (OuterVolumeSpecName: "kube-api-access-wxjzg") pod "4c4841ee-0a5e-40b3-8485-6c58a0229301" (UID: "4c4841ee-0a5e-40b3-8485-6c58a0229301"). InnerVolumeSpecName "kube-api-access-wxjzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.454946 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n" (OuterVolumeSpecName: "kube-api-access-k9h7n") pod "cbdbb880-4817-4aa5-88a8-ba637b9ea219" (UID: "cbdbb880-4817-4aa5-88a8-ba637b9ea219"). InnerVolumeSpecName "kube-api-access-k9h7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.549710 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxjzg\" (UniqueName: \"kubernetes.io/projected/4c4841ee-0a5e-40b3-8485-6c58a0229301-kube-api-access-wxjzg\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.549759 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.549772 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbdbb880-4817-4aa5-88a8-ba637b9ea219-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.549783 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4841ee-0a5e-40b3-8485-6c58a0229301-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.549794 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9h7n\" (UniqueName: \"kubernetes.io/projected/cbdbb880-4817-4aa5-88a8-ba637b9ea219-kube-api-access-k9h7n\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.571011 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" event={"ID":"4c4841ee-0a5e-40b3-8485-6c58a0229301","Type":"ContainerDied","Data":"3f36a01bc98e1cd313e01c4f2f69a67e8936b21968851cc23b32e8bb5e4ac6e0"} Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.571074 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-pssbt" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.577940 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" event={"ID":"cbdbb880-4817-4aa5-88a8-ba637b9ea219","Type":"ContainerDied","Data":"095acce0d5ae9f5d845945fbf76ed367326e5013c21de334874b41a7fc385add"} Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.577948 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49slj" Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.579495 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2","Type":"ContainerStarted","Data":"b3658819fdf11af7842515a0ecc8aa50e9ac24fafbf7cfb36fae9f7e04de231b"} Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.600860 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.619925 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.633044 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-pssbt"] Jan 10 16:42:32 crc kubenswrapper[5036]: W0110 16:42:32.638646 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b8c3a9_e6b8_4f1a_b0a4_5370e9e5e2f2.slice/crio-57abf8c253de693a61280fcd097099a877f26c7d0c605b9256fec6a1792b793c WatchSource:0}: Error finding container 57abf8c253de693a61280fcd097099a877f26c7d0c605b9256fec6a1792b793c: Status 404 returned error can't find the container with id 57abf8c253de693a61280fcd097099a877f26c7d0c605b9256fec6a1792b793c Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.670831 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.681253 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49slj"] Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.714942 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.807664 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:42:32 crc kubenswrapper[5036]: W0110 16:42:32.815907 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66dcc1cf_f7f9_4064_b019_4ec5f205ea03.slice/crio-427a1549c177d1cb90bb460a678a5aa958f202bae0caeebb3303cc6fac996785 WatchSource:0}: Error finding container 427a1549c177d1cb90bb460a678a5aa958f202bae0caeebb3303cc6fac996785: Status 404 returned error can't find the container with id 427a1549c177d1cb90bb460a678a5aa958f202bae0caeebb3303cc6fac996785 Jan 10 16:42:32 crc kubenswrapper[5036]: I0110 16:42:32.910490 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 10 16:42:32 crc kubenswrapper[5036]: W0110 16:42:32.913768 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb642befa_dd18_4984_b74f_d3945ee06f7d.slice/crio-7aa29402c9e68a1542884c38f4cc167e0506ff36dc924507a7da2ba6b27f8846 WatchSource:0}: Error finding container 7aa29402c9e68a1542884c38f4cc167e0506ff36dc924507a7da2ba6b27f8846: Status 404 returned error can't find the container with id 7aa29402c9e68a1542884c38f4cc167e0506ff36dc924507a7da2ba6b27f8846 Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.016611 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vsd6b"] Jan 10 16:42:33 crc kubenswrapper[5036]: W0110 16:42:33.017438 5036 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65d28afa_c448_4c8a_8fe9_062d9383f484.slice/crio-cacc4e9ad0b54f376654955b554eb2a03e43d16d9102e5f3c667934dbfb2439f WatchSource:0}: Error finding container cacc4e9ad0b54f376654955b554eb2a03e43d16d9102e5f3c667934dbfb2439f: Status 404 returned error can't find the container with id cacc4e9ad0b54f376654955b554eb2a03e43d16d9102e5f3c667934dbfb2439f Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.085376 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw"] Jan 10 16:42:33 crc kubenswrapper[5036]: W0110 16:42:33.091275 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe4f7b3d_ab10_498f_ac5a_9b37dafcd5f4.slice/crio-90eee8d3dbc4664319776e1ba68fc9469d4df72bdd8925094ceec250f6282806 WatchSource:0}: Error finding container 90eee8d3dbc4664319776e1ba68fc9469d4df72bdd8925094ceec250f6282806: Status 404 returned error can't find the container with id 90eee8d3dbc4664319776e1ba68fc9469d4df72bdd8925094ceec250f6282806 Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.107342 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 10 16:42:33 crc kubenswrapper[5036]: W0110 16:42:33.112191 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f74eaf1_cd39_41dc_8c0a_170373e863e5.slice/crio-f0825a49face9bfb05422fc617de048a8a8adf6ecb3b68e40522aaf21f2e72cf WatchSource:0}: Error finding container f0825a49face9bfb05422fc617de048a8a8adf6ecb3b68e40522aaf21f2e72cf: Status 404 returned error can't find the container with id f0825a49face9bfb05422fc617de048a8a8adf6ecb3b68e40522aaf21f2e72cf Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.587916 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66dcc1cf-f7f9-4064-b019-4ec5f205ea03","Type":"ContainerStarted","Data":"427a1549c177d1cb90bb460a678a5aa958f202bae0caeebb3303cc6fac996785"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.590081 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerStarted","Data":"b38501eb98d118f20f0286b4234d2a328046ab9c0e36a7a16932107efc4a4163"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.591223 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f74eaf1-cd39-41dc-8c0a-170373e863e5","Type":"ContainerStarted","Data":"f0825a49face9bfb05422fc617de048a8a8adf6ecb3b68e40522aaf21f2e72cf"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.592451 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b642befa-dd18-4984-b74f-d3945ee06f7d","Type":"ContainerStarted","Data":"7aa29402c9e68a1542884c38f4cc167e0506ff36dc924507a7da2ba6b27f8846"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.593584 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw" event={"ID":"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4","Type":"ContainerStarted","Data":"90eee8d3dbc4664319776e1ba68fc9469d4df72bdd8925094ceec250f6282806"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.594348 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2","Type":"ContainerStarted","Data":"57abf8c253de693a61280fcd097099a877f26c7d0c605b9256fec6a1792b793c"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.595488 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsd6b" event={"ID":"65d28afa-c448-4c8a-8fe9-062d9383f484","Type":"ContainerStarted","Data":"cacc4e9ad0b54f376654955b554eb2a03e43d16d9102e5f3c667934dbfb2439f"} Jan 10 16:42:33 crc kubenswrapper[5036]: I0110 16:42:33.596631 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"066ba36b-3da0-4db3-8f19-13e5a5227ab5","Type":"ContainerStarted","Data":"e8d3f9fda6c71f6c193c1752cb3586c85a70cfe6f427a97d41e207a4d12a61e8"} Jan 10 16:42:34 crc kubenswrapper[5036]: I0110 16:42:34.518094 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4841ee-0a5e-40b3-8485-6c58a0229301" path="/var/lib/kubelet/pods/4c4841ee-0a5e-40b3-8485-6c58a0229301/volumes" Jan 10 16:42:34 crc kubenswrapper[5036]: I0110 16:42:34.518442 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbdbb880-4817-4aa5-88a8-ba637b9ea219" path="/var/lib/kubelet/pods/cbdbb880-4817-4aa5-88a8-ba637b9ea219/volumes" Jan 10 16:42:34 crc kubenswrapper[5036]: I0110 16:42:34.607215 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerStarted","Data":"47b9a7ad1c5fce0bfe89393197d8537afd219923643fdbc434656da54dadbe16"} Jan 10 16:42:36 crc kubenswrapper[5036]: I0110 16:42:36.647462 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-65vvl" podStartSLOduration=6.460544106 podStartE2EDuration="27.647441974s" podCreationTimestamp="2026-01-10 16:42:09 +0000 UTC" firstStartedPulling="2026-01-10 16:42:11.367465542 +0000 UTC m=+853.237701036" lastFinishedPulling="2026-01-10 16:42:32.55436341 +0000 UTC m=+874.424598904" observedRunningTime="2026-01-10 16:42:36.643405381 +0000 UTC m=+878.513640895" watchObservedRunningTime="2026-01-10 16:42:36.647441974 +0000 UTC m=+878.517677468" Jan 10 16:42:37 crc kubenswrapper[5036]: I0110 16:42:37.632785 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerStarted","Data":"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4"} Jan 10 16:42:37 crc kubenswrapper[5036]: I0110 16:42:37.635719 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerID="47b9a7ad1c5fce0bfe89393197d8537afd219923643fdbc434656da54dadbe16" exitCode=0 Jan 10 16:42:37 crc kubenswrapper[5036]: I0110 16:42:37.635817 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerDied","Data":"47b9a7ad1c5fce0bfe89393197d8537afd219923643fdbc434656da54dadbe16"} Jan 10 16:42:37 crc kubenswrapper[5036]: I0110 16:42:37.637663 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerStarted","Data":"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e"} Jan 10 16:42:38 crc kubenswrapper[5036]: I0110 16:42:38.651783 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" 
event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerStarted","Data":"ae7e21bf395067153d0c94d4312f3124a22507d499d9989539f1915fdbbdf22d"} Jan 10 16:42:38 crc kubenswrapper[5036]: I0110 16:42:38.652651 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:38 crc kubenswrapper[5036]: I0110 16:42:38.685998 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" podStartSLOduration=7.707430236 podStartE2EDuration="27.685978723s" podCreationTimestamp="2026-01-10 16:42:11 +0000 UTC" firstStartedPulling="2026-01-10 16:42:12.577757618 +0000 UTC m=+854.447993112" lastFinishedPulling="2026-01-10 16:42:32.556306105 +0000 UTC m=+874.426541599" observedRunningTime="2026-01-10 16:42:38.683742501 +0000 UTC m=+880.553977995" watchObservedRunningTime="2026-01-10 16:42:38.685978723 +0000 UTC m=+880.556214217" Jan 10 16:42:39 crc kubenswrapper[5036]: I0110 16:42:39.795451 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:39 crc kubenswrapper[5036]: I0110 16:42:39.795490 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:39 crc kubenswrapper[5036]: I0110 16:42:39.853396 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:40 crc kubenswrapper[5036]: I0110 16:42:40.719734 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:40 crc kubenswrapper[5036]: I0110 16:42:40.767143 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:42 crc kubenswrapper[5036]: I0110 16:42:42.086925 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:42:42 crc kubenswrapper[5036]: I0110 16:42:42.264833 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:42 crc kubenswrapper[5036]: I0110 16:42:42.735040 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-65vvl" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="registry-server" containerID="cri-o://b38501eb98d118f20f0286b4234d2a328046ab9c0e36a7a16932107efc4a4163" gracePeriod=2 Jan 10 16:42:43 crc kubenswrapper[5036]: I0110 16:42:43.749586 5036 generic.go:334] "Generic (PLEG): container finished" podID="6b614213-d497-4882-9055-dc68dd058b01" containerID="b38501eb98d118f20f0286b4234d2a328046ab9c0e36a7a16932107efc4a4163" exitCode=0 Jan 10 16:42:43 crc kubenswrapper[5036]: I0110 16:42:43.749715 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerDied","Data":"b38501eb98d118f20f0286b4234d2a328046ab9c0e36a7a16932107efc4a4163"} Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.035908 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.161220 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc\") pod \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.161337 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config\") pod \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.161483 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzrlh\" (UniqueName: \"kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh\") pod \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\" (UID: \"7b4b1dc6-7568-41aa-926b-014a274ef1d6\") " Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.161895 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config" (OuterVolumeSpecName: "config") pod "7b4b1dc6-7568-41aa-926b-014a274ef1d6" (UID: "7b4b1dc6-7568-41aa-926b-014a274ef1d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.161935 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b4b1dc6-7568-41aa-926b-014a274ef1d6" (UID: "7b4b1dc6-7568-41aa-926b-014a274ef1d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.168189 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh" (OuterVolumeSpecName: "kube-api-access-pzrlh") pod "7b4b1dc6-7568-41aa-926b-014a274ef1d6" (UID: "7b4b1dc6-7568-41aa-926b-014a274ef1d6"). InnerVolumeSpecName "kube-api-access-pzrlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.263451 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzrlh\" (UniqueName: \"kubernetes.io/projected/7b4b1dc6-7568-41aa-926b-014a274ef1d6-kube-api-access-pzrlh\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.263509 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.263524 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b4b1dc6-7568-41aa-926b-014a274ef1d6-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.761879 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" event={"ID":"7b4b1dc6-7568-41aa-926b-014a274ef1d6","Type":"ContainerDied","Data":"ea1ead3cf516e90ec3250232e4dc6a2ae2feb7fb5fc8fca1c0a31ced1f06ef5a"} Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.761956 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-vccnd" Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.812413 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:44 crc kubenswrapper[5036]: I0110 16:42:44.820602 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-vccnd"] Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.515922 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4b1dc6-7568-41aa-926b-014a274ef1d6" path="/var/lib/kubelet/pods/7b4b1dc6-7568-41aa-926b-014a274ef1d6/volumes" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.673069 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.700943 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content\") pod \"6b614213-d497-4882-9055-dc68dd058b01\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.701107 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities\") pod \"6b614213-d497-4882-9055-dc68dd058b01\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.701271 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs4bl\" (UniqueName: \"kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl\") pod \"6b614213-d497-4882-9055-dc68dd058b01\" (UID: \"6b614213-d497-4882-9055-dc68dd058b01\") " Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.704841 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities" (OuterVolumeSpecName: "utilities") pod "6b614213-d497-4882-9055-dc68dd058b01" (UID: "6b614213-d497-4882-9055-dc68dd058b01"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.714367 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl" (OuterVolumeSpecName: "kube-api-access-fs4bl") pod "6b614213-d497-4882-9055-dc68dd058b01" (UID: "6b614213-d497-4882-9055-dc68dd058b01"). InnerVolumeSpecName "kube-api-access-fs4bl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.755501 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b614213-d497-4882-9055-dc68dd058b01" (UID: "6b614213-d497-4882-9055-dc68dd058b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.781074 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65vvl" event={"ID":"6b614213-d497-4882-9055-dc68dd058b01","Type":"ContainerDied","Data":"4fcd53dfe451d35e1d5ef005c8ddda1c71d153be6e0ac67d5594304f68e8b9f2"} Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.781094 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65vvl" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.781504 5036 scope.go:117] "RemoveContainer" containerID="b38501eb98d118f20f0286b4234d2a328046ab9c0e36a7a16932107efc4a4163" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.803413 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs4bl\" (UniqueName: \"kubernetes.io/projected/6b614213-d497-4882-9055-dc68dd058b01-kube-api-access-fs4bl\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.803531 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.803644 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b614213-d497-4882-9055-dc68dd058b01-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.818273 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:46 crc kubenswrapper[5036]: I0110 16:42:46.823956 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-65vvl"] Jan 10 16:42:47 crc kubenswrapper[5036]: I0110 16:42:47.252131 5036 scope.go:117] "RemoveContainer" containerID="cc21e92489404e4396554aa1d6bb669c52cefa54c05200a2f1d07e692b23f220" Jan 10 16:42:47 crc kubenswrapper[5036]: I0110 16:42:47.339606 5036 scope.go:117] "RemoveContainer" containerID="9232ea53f4be74d0156a8f1a8db28ec9ed79e41a520588a7e3a111dbbac6b8bf" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.518045 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b614213-d497-4882-9055-dc68dd058b01" path="/var/lib/kubelet/pods/6b614213-d497-4882-9055-dc68dd058b01/volumes" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.802218 5036 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2","Type":"ContainerStarted","Data":"e9a188f53c0a2a5e3bb3b8ccd9c5f5d409458269c45072e8f7b781d60c809214"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.803691 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2","Type":"ContainerStarted","Data":"03802537d8eb649bcc943805c432438d21338e1a175477d4d3dcedc958c85f6f"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.805823 5036 generic.go:334] "Generic (PLEG): container finished" podID="65d28afa-c448-4c8a-8fe9-062d9383f484" containerID="6c41ef4497bdcb3a97c8dc3a989fc5f381b9fa6e5d26e44cc0798a11b9261831" exitCode=0 Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.805926 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsd6b" event={"ID":"65d28afa-c448-4c8a-8fe9-062d9383f484","Type":"ContainerDied","Data":"6c41ef4497bdcb3a97c8dc3a989fc5f381b9fa6e5d26e44cc0798a11b9261831"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.808315 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"066ba36b-3da0-4db3-8f19-13e5a5227ab5","Type":"ContainerStarted","Data":"58e86da87c52a31a90eef19bcaba140899ed2e6c0f2ca677fbb617bf03f370c3"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.808365 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.809770 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66dcc1cf-f7f9-4064-b019-4ec5f205ea03","Type":"ContainerStarted","Data":"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.810113 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.813407 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f74eaf1-cd39-41dc-8c0a-170373e863e5","Type":"ContainerStarted","Data":"3e48f0eaef58bed6679893fbb09bab83fd4d4bad80e136e51019b27ca6ddb5f2"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.815173 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b642befa-dd18-4984-b74f-d3945ee06f7d","Type":"ContainerStarted","Data":"a55b6f9b658799cf9bc71dd6fe53ac95210a40e183c4dcf9ca3d8aed8634b7d7"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.816908 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw" event={"ID":"be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4","Type":"ContainerStarted","Data":"9f14c4c869dc860f4a2391916a1ebfcc0127f802d7faf7a11d0f608cbd7c1dfe"} Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.817256 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-czqbw" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.847871 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.450439535 podStartE2EDuration="33.847854091s" podCreationTimestamp="2026-01-10 16:42:15 +0000 UTC" firstStartedPulling="2026-01-10 16:42:32.739489387 +0000 UTC m=+874.609724881" lastFinishedPulling="2026-01-10 16:42:47.136903943 +0000 UTC m=+889.007139437" 
observedRunningTime="2026-01-10 16:42:48.847224294 +0000 UTC m=+890.717459808" watchObservedRunningTime="2026-01-10 16:42:48.847854091 +0000 UTC m=+890.718089585" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.894198 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-czqbw" podStartSLOduration=14.84929617 podStartE2EDuration="28.894181607s" podCreationTimestamp="2026-01-10 16:42:20 +0000 UTC" firstStartedPulling="2026-01-10 16:42:33.093270811 +0000 UTC m=+874.963506305" lastFinishedPulling="2026-01-10 16:42:47.138156248 +0000 UTC m=+889.008391742" observedRunningTime="2026-01-10 16:42:48.891179953 +0000 UTC m=+890.761415457" watchObservedRunningTime="2026-01-10 16:42:48.894181607 +0000 UTC m=+890.764417091" Jan 10 16:42:48 crc kubenswrapper[5036]: I0110 16:42:48.912064 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.006663024 podStartE2EDuration="31.912047526s" podCreationTimestamp="2026-01-10 16:42:17 +0000 UTC" firstStartedPulling="2026-01-10 16:42:32.821629434 +0000 UTC m=+874.691864918" lastFinishedPulling="2026-01-10 16:42:47.727013926 +0000 UTC m=+889.597249420" observedRunningTime="2026-01-10 16:42:48.906917123 +0000 UTC m=+890.777152617" watchObservedRunningTime="2026-01-10 16:42:48.912047526 +0000 UTC m=+890.782283010" Jan 10 16:42:49 crc kubenswrapper[5036]: I0110 16:42:49.840461 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsd6b" event={"ID":"65d28afa-c448-4c8a-8fe9-062d9383f484","Type":"ContainerStarted","Data":"ac42f41c5ef8530533419552cab605c5f857453e43998c77f742eb28fbed4ce0"} Jan 10 16:42:49 crc kubenswrapper[5036]: I0110 16:42:49.840496 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vsd6b" event={"ID":"65d28afa-c448-4c8a-8fe9-062d9383f484","Type":"ContainerStarted","Data":"f7cdf5ebddd22001d2e7e00a103ca2e67c3286a9197209d7a7d7ebc6d1a4c4be"} Jan 10 16:42:49 crc kubenswrapper[5036]: I0110 16:42:49.841929 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:49 crc kubenswrapper[5036]: I0110 16:42:49.841970 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:42:49 crc kubenswrapper[5036]: I0110 16:42:49.885855 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vsd6b" podStartSLOduration=15.862320059 podStartE2EDuration="29.885836889s" podCreationTimestamp="2026-01-10 16:42:20 +0000 UTC" firstStartedPulling="2026-01-10 16:42:33.019445216 +0000 UTC m=+874.889680710" lastFinishedPulling="2026-01-10 16:42:47.042962036 +0000 UTC m=+888.913197540" observedRunningTime="2026-01-10 16:42:49.883901765 +0000 UTC m=+891.754137269" watchObservedRunningTime="2026-01-10 16:42:49.885836889 +0000 UTC m=+891.756072383" Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.858936 5036 generic.go:334] "Generic (PLEG): container finished" podID="3f624572-bbfe-4c9d-be6f-f8f647fd8aa2" containerID="03802537d8eb649bcc943805c432438d21338e1a175477d4d3dcedc958c85f6f" exitCode=0 Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.859028 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2","Type":"ContainerDied","Data":"03802537d8eb649bcc943805c432438d21338e1a175477d4d3dcedc958c85f6f"} Jan 10 
16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.862076 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4f74eaf1-cd39-41dc-8c0a-170373e863e5","Type":"ContainerStarted","Data":"4b3fe8193633456cbda93951cebc53553e4e002b730b3346f313e603fff0a8fe"} Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.864103 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b642befa-dd18-4984-b74f-d3945ee06f7d","Type":"ContainerStarted","Data":"d5413870c1734639b3c4af27434b2541ca8a95be42aaf45ea7943821ee9ec2bf"} Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.866106 5036 generic.go:334] "Generic (PLEG): container finished" podID="78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2" containerID="e9a188f53c0a2a5e3bb3b8ccd9c5f5d409458269c45072e8f7b781d60c809214" exitCode=0 Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.866214 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2","Type":"ContainerDied","Data":"e9a188f53c0a2a5e3bb3b8ccd9c5f5d409458269c45072e8f7b781d60c809214"} Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.941943 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.835255241 podStartE2EDuration="28.941923007s" podCreationTimestamp="2026-01-10 16:42:23 +0000 UTC" firstStartedPulling="2026-01-10 16:42:33.114927337 +0000 UTC m=+874.985162841" lastFinishedPulling="2026-01-10 16:42:51.221595113 +0000 UTC m=+893.091830607" observedRunningTime="2026-01-10 16:42:51.935366434 +0000 UTC m=+893.805601938" watchObservedRunningTime="2026-01-10 16:42:51.941923007 +0000 UTC m=+893.812158501" Jan 10 16:42:51 crc kubenswrapper[5036]: I0110 16:42:51.958464 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.661825921 podStartE2EDuration="31.958450889s" podCreationTimestamp="2026-01-10 16:42:20 +0000 UTC" firstStartedPulling="2026-01-10 16:42:32.915262443 +0000 UTC m=+874.785497937" lastFinishedPulling="2026-01-10 16:42:51.211887411 +0000 UTC m=+893.082122905" observedRunningTime="2026-01-10 16:42:51.954345935 +0000 UTC m=+893.824581429" watchObservedRunningTime="2026-01-10 16:42:51.958450889 +0000 UTC m=+893.828686383" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.203119 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.203168 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.238307 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.323317 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.358363 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.875705 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2","Type":"ContainerStarted","Data":"c8255ef1f9429f80a0efcaf128c682234d2846d8e0cde5b3ccad41dab9c95833"} Jan 10 
16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.877903 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3f624572-bbfe-4c9d-be6f-f8f647fd8aa2","Type":"ContainerStarted","Data":"3705c54ceca1ae90255ab742fdea1da39cb971fcf4131d34670b2cdfb34979d4"} Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.878259 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.900079 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.499167539 podStartE2EDuration="38.900062962s" podCreationTimestamp="2026-01-10 16:42:14 +0000 UTC" firstStartedPulling="2026-01-10 16:42:32.641486857 +0000 UTC m=+874.511722351" lastFinishedPulling="2026-01-10 16:42:47.04238228 +0000 UTC m=+888.912617774" observedRunningTime="2026-01-10 16:42:52.898944741 +0000 UTC m=+894.769180235" watchObservedRunningTime="2026-01-10 16:42:52.900062962 +0000 UTC m=+894.770298446" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.920761 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.924164 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 10 16:42:52 crc kubenswrapper[5036]: I0110 16:42:52.933354 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.979784794 podStartE2EDuration="40.933338922s" podCreationTimestamp="2026-01-10 16:42:12 +0000 UTC" firstStartedPulling="2026-01-10 16:42:32.183367386 +0000 UTC m=+874.053602880" lastFinishedPulling="2026-01-10 16:42:47.136921514 +0000 UTC m=+889.007157008" observedRunningTime="2026-01-10 16:42:52.929012831 +0000 UTC m=+894.799248345" watchObservedRunningTime="2026-01-10 16:42:52.933338922 +0000 UTC m=+894.803574416" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.089349 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xzkz8"] Jan 10 16:42:53 crc kubenswrapper[5036]: E0110 16:42:53.089765 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="registry-server" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.089784 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="registry-server" Jan 10 16:42:53 crc kubenswrapper[5036]: E0110 16:42:53.089797 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="extract-utilities" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.089805 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="extract-utilities" Jan 10 16:42:53 crc kubenswrapper[5036]: E0110 16:42:53.089820 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="extract-content" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.089826 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="extract-content" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.089973 5036 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6b614213-d497-4882-9055-dc68dd058b01" containerName="registry-server" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.091388 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.097440 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.111566 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xzkz8"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.153901 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jp5mj"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.154928 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.159624 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.170276 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jp5mj"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.181630 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-config\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.181739 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-combined-ca-bundle\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.181792 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.181812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.181957 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfhz\" (UniqueName: \"kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.182013 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovs-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.182032 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.182050 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovn-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.182078 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.182239 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdnzt\" (UniqueName: \"kubernetes.io/projected/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-kube-api-access-pdnzt\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.215863 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.220308 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.224314 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.224602 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-8gjfz" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.225079 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.225273 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.231109 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.255708 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xzkz8"] Jan 10 16:42:53 crc kubenswrapper[5036]: E0110 16:42:53.256244 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-tzfhz ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" podUID="46998ec5-01d0-4bee-8acb-a0f881089396" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.282887 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.284087 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.286330 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287030 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovn-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287084 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287121 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-config\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287164 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287274 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovn-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287366 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdnzt\" (UniqueName: \"kubernetes.io/projected/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-kube-api-access-pdnzt\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287406 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j884f\" (UniqueName: \"kubernetes.io/projected/1d1aa719-1166-4afe-8263-c771aa0a25da-kube-api-access-j884f\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287455 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-config\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287483 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-combined-ca-bundle\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287544 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287586 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287655 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-scripts\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287709 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.287745 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288184 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-config\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288287 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfhz\" (UniqueName: \"kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288337 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovs-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288361 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288664 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.288783 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-ovs-rundir\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.289245 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.289657 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: 
\"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.290979 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.291302 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-combined-ca-bundle\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.293798 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.312216 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfhz\" (UniqueName: \"kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz\") pod \"dnsmasq-dns-7fd796d7df-xzkz8\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.322822 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdnzt\" (UniqueName: \"kubernetes.io/projected/d0f482ce-10a1-42c2-80f6-60fd28c8cc25-kube-api-access-pdnzt\") pod \"ovn-controller-metrics-jp5mj\" (UID: \"d0f482ce-10a1-42c2-80f6-60fd28c8cc25\") " pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390364 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-scripts\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390456 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390534 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390565 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-config\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390738 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.390913 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cs7f\" (UniqueName: \"kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391039 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391232 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j884f\" (UniqueName: \"kubernetes.io/projected/1d1aa719-1166-4afe-8263-c771aa0a25da-kube-api-access-j884f\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391345 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391422 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391465 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391492 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391477 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-scripts\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.391624 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1aa719-1166-4afe-8263-c771aa0a25da-config\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc 
kubenswrapper[5036]: I0110 16:42:53.392109 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.394761 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.396307 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.405519 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d1aa719-1166-4afe-8263-c771aa0a25da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.405953 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j884f\" (UniqueName: \"kubernetes.io/projected/1d1aa719-1166-4afe-8263-c771aa0a25da-kube-api-access-j884f\") pod \"ovn-northd-0\" (UID: \"1d1aa719-1166-4afe-8263-c771aa0a25da\") " pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.472148 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jp5mj" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.493125 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.493217 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.493258 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cs7f\" (UniqueName: \"kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.493288 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.493331 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.494518 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.494546 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.494841 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.495334 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 
16:42:53.510897 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cs7f\" (UniqueName: \"kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f\") pod \"dnsmasq-dns-86db49b7ff-7p28q\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.548067 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.669781 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.888308 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:53 crc kubenswrapper[5036]: I0110 16:42:53.903738 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.000732 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc\") pod \"46998ec5-01d0-4bee-8acb-a0f881089396\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.001235 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config\") pod \"46998ec5-01d0-4bee-8acb-a0f881089396\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.001888 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config" (OuterVolumeSpecName: "config") pod "46998ec5-01d0-4bee-8acb-a0f881089396" (UID: "46998ec5-01d0-4bee-8acb-a0f881089396"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.001912 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46998ec5-01d0-4bee-8acb-a0f881089396" (UID: "46998ec5-01d0-4bee-8acb-a0f881089396"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.002999 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb\") pod \"46998ec5-01d0-4bee-8acb-a0f881089396\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.003041 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzfhz\" (UniqueName: \"kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz\") pod \"46998ec5-01d0-4bee-8acb-a0f881089396\" (UID: \"46998ec5-01d0-4bee-8acb-a0f881089396\") " Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.004193 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "46998ec5-01d0-4bee-8acb-a0f881089396" (UID: "46998ec5-01d0-4bee-8acb-a0f881089396"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.010024 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.010055 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.010065 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/46998ec5-01d0-4bee-8acb-a0f881089396-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.014120 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz" (OuterVolumeSpecName: "kube-api-access-tzfhz") pod "46998ec5-01d0-4bee-8acb-a0f881089396" (UID: "46998ec5-01d0-4bee-8acb-a0f881089396"). InnerVolumeSpecName "kube-api-access-tzfhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.112936 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzfhz\" (UniqueName: \"kubernetes.io/projected/46998ec5-01d0-4bee-8acb-a0f881089396-kube-api-access-tzfhz\") on node \"crc\" DevicePath \"\"" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.168389 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jp5mj"] Jan 10 16:42:54 crc kubenswrapper[5036]: W0110 16:42:54.170752 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f482ce_10a1_42c2_80f6_60fd28c8cc25.slice/crio-614c847703ca948e86b5fab441fa4bfd6de5cbeb9da7d18b456dc5497c0ad60c WatchSource:0}: Error finding container 614c847703ca948e86b5fab441fa4bfd6de5cbeb9da7d18b456dc5497c0ad60c: Status 404 returned error can't find the container with id 614c847703ca948e86b5fab441fa4bfd6de5cbeb9da7d18b456dc5497c0ad60c Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.294979 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 10 16:42:54 crc kubenswrapper[5036]: W0110 16:42:54.304815 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1aa719_1166_4afe_8263_c771aa0a25da.slice/crio-eab8a2710d2239c549d7392bffac1eb1534ece265b23c59a5cccd6d53576d357 WatchSource:0}: Error finding container eab8a2710d2239c549d7392bffac1eb1534ece265b23c59a5cccd6d53576d357: Status 404 returned error can't find the container with id eab8a2710d2239c549d7392bffac1eb1534ece265b23c59a5cccd6d53576d357 Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.346650 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.346981 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.359386 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.897342 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jp5mj" event={"ID":"d0f482ce-10a1-42c2-80f6-60fd28c8cc25","Type":"ContainerStarted","Data":"1e45e92dc5a8eae1cc7e2bf163cbff1498171d6e89d0713efdeb97080b5fb690"} Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.897622 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jp5mj" event={"ID":"d0f482ce-10a1-42c2-80f6-60fd28c8cc25","Type":"ContainerStarted","Data":"614c847703ca948e86b5fab441fa4bfd6de5cbeb9da7d18b456dc5497c0ad60c"} Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.900529 5036 generic.go:334] "Generic (PLEG): container finished" podID="ba71d9fa-3872-4975-8f46-767f96064411" containerID="f47fc35c910e13afe46c3fd40743e39da0247e4dca1ad5fb4fce4ac8a4c9e339" exitCode=0 Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.900561 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" event={"ID":"ba71d9fa-3872-4975-8f46-767f96064411","Type":"ContainerDied","Data":"f47fc35c910e13afe46c3fd40743e39da0247e4dca1ad5fb4fce4ac8a4c9e339"} Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.900595 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" event={"ID":"ba71d9fa-3872-4975-8f46-767f96064411","Type":"ContainerStarted","Data":"b49b34eb0f028847e2974dba9a1c8267fa741b7e10cc0b972bbf9337017079bd"} Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.903271 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d1aa719-1166-4afe-8263-c771aa0a25da","Type":"ContainerStarted","Data":"eab8a2710d2239c549d7392bffac1eb1534ece265b23c59a5cccd6d53576d357"} Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.903375 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xzkz8" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.924645 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jp5mj" podStartSLOduration=1.9246274190000001 podStartE2EDuration="1.924627419s" podCreationTimestamp="2026-01-10 16:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:42:54.918020024 +0000 UTC m=+896.788255518" watchObservedRunningTime="2026-01-10 16:42:54.924627419 +0000 UTC m=+896.794862913" Jan 10 16:42:54 crc kubenswrapper[5036]: I0110 16:42:54.992048 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xzkz8"] Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.001495 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xzkz8"] Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.811978 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.812055 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.917234 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" event={"ID":"ba71d9fa-3872-4975-8f46-767f96064411","Type":"ContainerStarted","Data":"cc34ed8a49035e5d47d4f6cddd767ffa78c14186e029a057944a0517ff7bd9d7"} Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.917653 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:42:55 crc kubenswrapper[5036]: I0110 16:42:55.938541 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" podStartSLOduration=2.938523493 podStartE2EDuration="2.938523493s" podCreationTimestamp="2026-01-10 16:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:42:55.933591975 +0000 UTC m=+897.803827469" watchObservedRunningTime="2026-01-10 16:42:55.938523493 +0000 UTC m=+897.808758987" Jan 10 16:42:56 crc kubenswrapper[5036]: I0110 16:42:56.082953 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 10 16:42:56 crc kubenswrapper[5036]: I0110 16:42:56.516669 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46998ec5-01d0-4bee-8acb-a0f881089396" path="/var/lib/kubelet/pods/46998ec5-01d0-4bee-8acb-a0f881089396/volumes" Jan 10 16:42:56 crc kubenswrapper[5036]: I0110 16:42:56.923184 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"1d1aa719-1166-4afe-8263-c771aa0a25da","Type":"ContainerStarted","Data":"3b36ff6a01cb25619ede858bd2ec3dc853d0ff94882047a8cbe5575c6e32383f"} Jan 10 16:42:56 crc kubenswrapper[5036]: I0110 16:42:56.923838 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d1aa719-1166-4afe-8263-c771aa0a25da","Type":"ContainerStarted","Data":"4d686a6fb3b7aa036f19148c1dd8268c6e3c8ad5f6ad65abe12e46437e9fa18e"} Jan 10 16:42:56 crc kubenswrapper[5036]: I0110 16:42:56.950236 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.510443092 podStartE2EDuration="3.950218966s" podCreationTimestamp="2026-01-10 16:42:53 +0000 UTC" firstStartedPulling="2026-01-10 16:42:54.306610906 +0000 UTC m=+896.176846400" lastFinishedPulling="2026-01-10 16:42:55.74638676 +0000 UTC m=+897.616622274" observedRunningTime="2026-01-10 16:42:56.947916851 +0000 UTC m=+898.818152385" watchObservedRunningTime="2026-01-10 16:42:56.950218966 +0000 UTC m=+898.820454460" Jan 10 16:42:57 crc kubenswrapper[5036]: I0110 16:42:57.870633 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 10 16:42:57 crc kubenswrapper[5036]: I0110 16:42:57.929857 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 10 16:42:58 crc kubenswrapper[5036]: I0110 16:42:58.191366 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:58 crc kubenswrapper[5036]: I0110 16:42:58.276993 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 10 16:42:58 crc kubenswrapper[5036]: I0110 16:42:58.627122 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 10 16:42:58 crc kubenswrapper[5036]: I0110 16:42:58.702335 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.097369 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6drbk"] Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.098762 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.102248 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.117339 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6drbk"] Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.171874 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftj9k\" (UniqueName: \"kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.172399 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.274174 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.274234 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftj9k\" (UniqueName: \"kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.276413 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.297846 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftj9k\" (UniqueName: \"kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k\") pod \"root-account-create-update-6drbk\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.420476 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.685741 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.762136 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.762731 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="dnsmasq-dns" containerID="cri-o://ae7e21bf395067153d0c94d4312f3124a22507d499d9989539f1915fdbbdf22d" gracePeriod=10 Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.957919 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6drbk"] Jan 10 16:43:03 crc kubenswrapper[5036]: I0110 16:43:03.998071 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6drbk" event={"ID":"b720948d-ad09-4e83-8451-34e2b039f1d1","Type":"ContainerStarted","Data":"b3a943189613822ea98cbd70e82e4016f3c11e87ee96fdd14941fdd137d61e66"} Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.000151 5036 generic.go:334] "Generic (PLEG): container finished" podID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerID="ae7e21bf395067153d0c94d4312f3124a22507d499d9989539f1915fdbbdf22d" exitCode=0 Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.000182 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerDied","Data":"ae7e21bf395067153d0c94d4312f3124a22507d499d9989539f1915fdbbdf22d"} Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.189026 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.332099 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlw4\" (UniqueName: \"kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4\") pod \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.332267 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc\") pod \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.332451 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config\") pod \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\" (UID: \"f4a7f810-b3cd-4699-9ac4-b09e68779b5f\") " Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.341062 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4" (OuterVolumeSpecName: "kube-api-access-wqlw4") pod "f4a7f810-b3cd-4699-9ac4-b09e68779b5f" (UID: "f4a7f810-b3cd-4699-9ac4-b09e68779b5f"). InnerVolumeSpecName "kube-api-access-wqlw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.385426 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config" (OuterVolumeSpecName: "config") pod "f4a7f810-b3cd-4699-9ac4-b09e68779b5f" (UID: "f4a7f810-b3cd-4699-9ac4-b09e68779b5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.402435 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4a7f810-b3cd-4699-9ac4-b09e68779b5f" (UID: "f4a7f810-b3cd-4699-9ac4-b09e68779b5f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.435427 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.435477 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:04 crc kubenswrapper[5036]: I0110 16:43:04.435496 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlw4\" (UniqueName: \"kubernetes.io/projected/f4a7f810-b3cd-4699-9ac4-b09e68779b5f-kube-api-access-wqlw4\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.009633 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6drbk" event={"ID":"b720948d-ad09-4e83-8451-34e2b039f1d1","Type":"ContainerStarted","Data":"5c58205136eb086db1b39529e4118ad4635d97fc9b6476a5d671853e9e0aa9d9"} Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.011759 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" event={"ID":"f4a7f810-b3cd-4699-9ac4-b09e68779b5f","Type":"ContainerDied","Data":"a46185c465a60e891364c9b4fdd46a4c78e127130157686236bedf735eb337af"} Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.011804 5036 scope.go:117] "RemoveContainer" containerID="ae7e21bf395067153d0c94d4312f3124a22507d499d9989539f1915fdbbdf22d" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.011836 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vhkcd" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.030302 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-6drbk" podStartSLOduration=2.030283176 podStartE2EDuration="2.030283176s" podCreationTimestamp="2026-01-10 16:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:05.022508979 +0000 UTC m=+906.892744473" watchObservedRunningTime="2026-01-10 16:43:05.030283176 +0000 UTC m=+906.900518670" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.043038 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.049727 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vhkcd"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.679939 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2hht5"] Jan 10 16:43:05 crc kubenswrapper[5036]: E0110 16:43:05.680284 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="init" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.680301 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="init" Jan 10 16:43:05 crc kubenswrapper[5036]: E0110 16:43:05.680364 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="dnsmasq-dns" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.680373 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="dnsmasq-dns" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.680553 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" containerName="dnsmasq-dns" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.681246 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.690270 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2hht5"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.756592 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.756712 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wccl\" (UniqueName: \"kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.784696 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-67ce-account-create-update-t7sxt"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.786486 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.788763 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.792184 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67ce-account-create-update-t7sxt"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.858073 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.858160 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4j2r\" (UniqueName: \"kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r\") pod \"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.858204 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts\") pod \"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.858448 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wccl\" (UniqueName: \"kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.858888 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.880845 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wccl\" (UniqueName: \"kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl\") pod \"keystone-db-create-2hht5\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.959755 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4j2r\" (UniqueName: \"kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r\") pod \"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.959817 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts\") pod 
\"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.961514 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts\") pod \"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.972764 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-bzzkq"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.974052 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.979996 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bzzkq"] Jan 10 16:43:05 crc kubenswrapper[5036]: I0110 16:43:05.983407 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4j2r\" (UniqueName: \"kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r\") pod \"keystone-67ce-account-create-update-t7sxt\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.005399 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.061067 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlrlw\" (UniqueName: \"kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.061393 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.102177 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0f05-account-create-update-vfvzx"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.103923 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.105533 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.112492 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f05-account-create-update-vfvzx"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.147894 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.162504 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlrlw\" (UniqueName: \"kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.162879 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.163534 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.183254 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlrlw\" (UniqueName: \"kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw\") pod \"placement-db-create-bzzkq\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.209063 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kmjz4"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.210377 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.216376 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kmjz4"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.264635 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.265518 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5vdt\" (UniqueName: \"kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.297323 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-813f-account-create-update-m5fwq"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.298442 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.303023 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.307043 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-813f-account-create-update-m5fwq"] Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.334159 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.367783 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.367843 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzq9\" (UniqueName: \"kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.367916 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5vdt\" (UniqueName: \"kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.367957 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcm7\" (UniqueName: \"kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.367989 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.368019 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.371855 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 
10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.383869 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5vdt\" (UniqueName: \"kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt\") pod \"placement-0f05-account-create-update-vfvzx\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.419318 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.455220 5036 scope.go:117] "RemoveContainer" containerID="47b9a7ad1c5fce0bfe89393197d8537afd219923643fdbc434656da54dadbe16" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.468613 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzq9\" (UniqueName: \"kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.468753 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcm7\" (UniqueName: \"kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.468816 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.468935 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.469865 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.469879 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.492204 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzq9\" (UniqueName: \"kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9\") pod \"glance-813f-account-create-update-m5fwq\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " 
pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.494152 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcm7\" (UniqueName: \"kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7\") pod \"glance-db-create-kmjz4\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.518886 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a7f810-b3cd-4699-9ac4-b09e68779b5f" path="/var/lib/kubelet/pods/f4a7f810-b3cd-4699-9ac4-b09e68779b5f/volumes" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.526631 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:06 crc kubenswrapper[5036]: I0110 16:43:06.631081 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.278596 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0f05-account-create-update-vfvzx"] Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.373254 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kmjz4"] Jan 10 16:43:07 crc kubenswrapper[5036]: W0110 16:43:07.376456 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b0a63f9_1828_482a_a1f1_d99bf4b8932e.slice/crio-b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3 WatchSource:0}: Error finding container b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3: Status 404 returned error can't find the container with id b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3 Jan 10 16:43:07 crc kubenswrapper[5036]: W0110 16:43:07.378590 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedca4de3_92c0_449a_a081_8868ada61ff8.slice/crio-7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667 WatchSource:0}: Error finding container 7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667: Status 404 returned error can't find the container with id 7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667 Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.379529 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-813f-account-create-update-m5fwq"] Jan 10 16:43:07 crc kubenswrapper[5036]: W0110 16:43:07.385016 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd206d6b1_be89_44b7_a4db_749bd0113be2.slice/crio-f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d WatchSource:0}: Error finding container f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d: Status 404 returned error can't find the container with id f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.389829 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bzzkq"] Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.572350 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2hht5"] Jan 10 16:43:07 crc 
kubenswrapper[5036]: W0110 16:43:07.575788 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeb3f8c5_2ecf_47d7_9ef3_4550ed574b62.slice/crio-9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442 WatchSource:0}: Error finding container 9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442: Status 404 returned error can't find the container with id 9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442 Jan 10 16:43:07 crc kubenswrapper[5036]: I0110 16:43:07.579499 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-67ce-account-create-update-t7sxt"] Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.048405 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kmjz4" event={"ID":"8b0a63f9-1828-482a-a1f1-d99bf4b8932e","Type":"ContainerStarted","Data":"b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.051365 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67ce-account-create-update-t7sxt" event={"ID":"4c0cdec2-be1f-4169-ade6-cf65905c7003","Type":"ContainerStarted","Data":"41ccba2ff9cd28075be02266cbf8dffd90dd56c198696d2efae0d636b58a22cf"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.053957 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f05-account-create-update-vfvzx" event={"ID":"90b5d891-59ef-43ca-9689-ec2f4bfa590c","Type":"ContainerStarted","Data":"53dffe79922877b0635494b33a5f44dc9838d913e074771dd563355c8121fc59"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.056229 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-813f-account-create-update-m5fwq" event={"ID":"edca4de3-92c0-449a-a081-8868ada61ff8","Type":"ContainerStarted","Data":"7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.058071 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bzzkq" event={"ID":"d206d6b1-be89-44b7-a4db-749bd0113be2","Type":"ContainerStarted","Data":"f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.060069 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2hht5" event={"ID":"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62","Type":"ContainerStarted","Data":"9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442"} Jan 10 16:43:08 crc kubenswrapper[5036]: I0110 16:43:08.700986 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.070953 5036 generic.go:334] "Generic (PLEG): container finished" podID="b720948d-ad09-4e83-8451-34e2b039f1d1" containerID="5c58205136eb086db1b39529e4118ad4635d97fc9b6476a5d671853e9e0aa9d9" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.071227 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6drbk" event={"ID":"b720948d-ad09-4e83-8451-34e2b039f1d1","Type":"ContainerDied","Data":"5c58205136eb086db1b39529e4118ad4635d97fc9b6476a5d671853e9e0aa9d9"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.073495 5036 generic.go:334] "Generic (PLEG): container finished" podID="8b0a63f9-1828-482a-a1f1-d99bf4b8932e" 
containerID="ee74419f111dba9520df1b18aafc859438c57e7070c97f35ec1de30560dddddd" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.073613 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kmjz4" event={"ID":"8b0a63f9-1828-482a-a1f1-d99bf4b8932e","Type":"ContainerDied","Data":"ee74419f111dba9520df1b18aafc859438c57e7070c97f35ec1de30560dddddd"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.074755 5036 generic.go:334] "Generic (PLEG): container finished" podID="4c0cdec2-be1f-4169-ade6-cf65905c7003" containerID="461ecf8322edf244c214d8b7efa3681c532d9083bb446e80b059a7e71741f1f8" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.074798 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67ce-account-create-update-t7sxt" event={"ID":"4c0cdec2-be1f-4169-ade6-cf65905c7003","Type":"ContainerDied","Data":"461ecf8322edf244c214d8b7efa3681c532d9083bb446e80b059a7e71741f1f8"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.076306 5036 generic.go:334] "Generic (PLEG): container finished" podID="90b5d891-59ef-43ca-9689-ec2f4bfa590c" containerID="50a713c8fd8d05f221fc8976bf3b1bb4574a5a58edb623c6b0ac8033165cbd5e" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.076399 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f05-account-create-update-vfvzx" event={"ID":"90b5d891-59ef-43ca-9689-ec2f4bfa590c","Type":"ContainerDied","Data":"50a713c8fd8d05f221fc8976bf3b1bb4574a5a58edb623c6b0ac8033165cbd5e"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.077499 5036 generic.go:334] "Generic (PLEG): container finished" podID="8146d758-62d6-4640-86f8-51b89a8a8519" containerID="140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.077553 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerDied","Data":"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.078701 5036 generic.go:334] "Generic (PLEG): container finished" podID="edca4de3-92c0-449a-a081-8868ada61ff8" containerID="373c35ab5e490fef737c84e596ab3610712a07ef15c8f5f8d1cf213c02dbb2d2" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.078753 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-813f-account-create-update-m5fwq" event={"ID":"edca4de3-92c0-449a-a081-8868ada61ff8","Type":"ContainerDied","Data":"373c35ab5e490fef737c84e596ab3610712a07ef15c8f5f8d1cf213c02dbb2d2"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.085130 5036 generic.go:334] "Generic (PLEG): container finished" podID="d206d6b1-be89-44b7-a4db-749bd0113be2" containerID="e8a700359f53490edb5a795d09be5fdbb709b849b02a9de930eb78d0f9537c7e" exitCode=0 Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.085355 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bzzkq" event={"ID":"d206d6b1-be89-44b7-a4db-749bd0113be2","Type":"ContainerDied","Data":"e8a700359f53490edb5a795d09be5fdbb709b849b02a9de930eb78d0f9537c7e"} Jan 10 16:43:09 crc kubenswrapper[5036]: I0110 16:43:09.087528 5036 generic.go:334] "Generic (PLEG): container finished" podID="aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" containerID="551ef618555643988bdae04127251fe941ba1ffb152c11e816b8bcd85bdf42f4" exitCode=0 Jan 10 16:43:09 crc 
kubenswrapper[5036]: I0110 16:43:09.087566 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2hht5" event={"ID":"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62","Type":"ContainerDied","Data":"551ef618555643988bdae04127251fe941ba1ffb152c11e816b8bcd85bdf42f4"} Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.099201 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerStarted","Data":"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7"} Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.099467 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.101129 5036 generic.go:334] "Generic (PLEG): container finished" podID="cd708bfb-a557-401f-b815-16d584c8eb78" containerID="40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4" exitCode=0 Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.101362 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerDied","Data":"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4"} Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.180885 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.632995898 podStartE2EDuration="59.180863588s" podCreationTimestamp="2026-01-10 16:42:11 +0000 UTC" firstStartedPulling="2026-01-10 16:42:21.005861743 +0000 UTC m=+862.876097227" lastFinishedPulling="2026-01-10 16:42:32.553729423 +0000 UTC m=+874.423964917" observedRunningTime="2026-01-10 16:43:10.175790094 +0000 UTC m=+912.046025598" watchObservedRunningTime="2026-01-10 16:43:10.180863588 +0000 UTC m=+912.051099092" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.687029 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.690711 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.699498 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.737860 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts\") pod \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.737938 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5vdt\" (UniqueName: \"kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt\") pod \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\" (UID: \"90b5d891-59ef-43ca-9689-ec2f4bfa590c\") " Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.747245 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt" (OuterVolumeSpecName: "kube-api-access-r5vdt") pod "90b5d891-59ef-43ca-9689-ec2f4bfa590c" (UID: "90b5d891-59ef-43ca-9689-ec2f4bfa590c"). InnerVolumeSpecName "kube-api-access-r5vdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:10 crc kubenswrapper[5036]: I0110 16:43:10.749562 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90b5d891-59ef-43ca-9689-ec2f4bfa590c" (UID: "90b5d891-59ef-43ca-9689-ec2f4bfa590c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.156745 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgcm7\" (UniqueName: \"kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7\") pod \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.156874 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wccl\" (UniqueName: \"kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl\") pod \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.156914 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts\") pod \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\" (UID: \"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.156967 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts\") pod \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\" (UID: \"8b0a63f9-1828-482a-a1f1-d99bf4b8932e\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.157376 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90b5d891-59ef-43ca-9689-ec2f4bfa590c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.157390 5036 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-r5vdt\" (UniqueName: \"kubernetes.io/projected/90b5d891-59ef-43ca-9689-ec2f4bfa590c-kube-api-access-r5vdt\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.157858 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b0a63f9-1828-482a-a1f1-d99bf4b8932e" (UID: "8b0a63f9-1828-482a-a1f1-d99bf4b8932e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.158420 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" (UID: "aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.163166 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl" (OuterVolumeSpecName: "kube-api-access-7wccl") pod "aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" (UID: "aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62"). InnerVolumeSpecName "kube-api-access-7wccl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.163572 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7" (OuterVolumeSpecName: "kube-api-access-vgcm7") pod "8b0a63f9-1828-482a-a1f1-d99bf4b8932e" (UID: "8b0a63f9-1828-482a-a1f1-d99bf4b8932e"). InnerVolumeSpecName "kube-api-access-vgcm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.187320 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-813f-account-create-update-m5fwq" event={"ID":"edca4de3-92c0-449a-a081-8868ada61ff8","Type":"ContainerDied","Data":"7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667"} Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.187354 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eee502f1f76a1ab444f191e6fe33d6f56403d27205a9c492743282f37413667" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.188891 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2hht5" event={"ID":"aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62","Type":"ContainerDied","Data":"9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442"} Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.188909 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adde9218c85ca7fcca0c07a51aa05a0fb52e5b3ae1e6b49424c9712921be442" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.188951 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2hht5" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.193856 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kmjz4" event={"ID":"8b0a63f9-1828-482a-a1f1-d99bf4b8932e","Type":"ContainerDied","Data":"b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3"} Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.193878 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b10cc49a446b3a848b317e6f7ae64a7501f609f446fcd428e86884ceb1c1ceb3" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.193935 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kmjz4" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.197221 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0f05-account-create-update-vfvzx" event={"ID":"90b5d891-59ef-43ca-9689-ec2f4bfa590c","Type":"ContainerDied","Data":"53dffe79922877b0635494b33a5f44dc9838d913e074771dd563355c8121fc59"} Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.197249 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53dffe79922877b0635494b33a5f44dc9838d913e074771dd563355c8121fc59" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.197373 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0f05-account-create-update-vfvzx" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.258536 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wccl\" (UniqueName: \"kubernetes.io/projected/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-kube-api-access-7wccl\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.258570 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.258579 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.258588 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgcm7\" (UniqueName: \"kubernetes.io/projected/8b0a63f9-1828-482a-a1f1-d99bf4b8932e-kube-api-access-vgcm7\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.266482 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.273358 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.278183 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462267 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4j2r\" (UniqueName: \"kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r\") pod \"4c0cdec2-be1f-4169-ade6-cf65905c7003\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462402 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlrlw\" (UniqueName: \"kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw\") pod \"d206d6b1-be89-44b7-a4db-749bd0113be2\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462439 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts\") pod \"4c0cdec2-be1f-4169-ade6-cf65905c7003\" (UID: \"4c0cdec2-be1f-4169-ade6-cf65905c7003\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462481 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts\") pod \"d206d6b1-be89-44b7-a4db-749bd0113be2\" (UID: \"d206d6b1-be89-44b7-a4db-749bd0113be2\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462529 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptzq9\" (UniqueName: \"kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9\") pod \"edca4de3-92c0-449a-a081-8868ada61ff8\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.462575 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts\") pod \"edca4de3-92c0-449a-a081-8868ada61ff8\" (UID: \"edca4de3-92c0-449a-a081-8868ada61ff8\") " Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.463281 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d206d6b1-be89-44b7-a4db-749bd0113be2" (UID: "d206d6b1-be89-44b7-a4db-749bd0113be2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.463429 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edca4de3-92c0-449a-a081-8868ada61ff8" (UID: "edca4de3-92c0-449a-a081-8868ada61ff8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.463828 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c0cdec2-be1f-4169-ade6-cf65905c7003" (UID: "4c0cdec2-be1f-4169-ade6-cf65905c7003"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.468295 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw" (OuterVolumeSpecName: "kube-api-access-qlrlw") pod "d206d6b1-be89-44b7-a4db-749bd0113be2" (UID: "d206d6b1-be89-44b7-a4db-749bd0113be2"). InnerVolumeSpecName "kube-api-access-qlrlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.469095 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r" (OuterVolumeSpecName: "kube-api-access-x4j2r") pod "4c0cdec2-be1f-4169-ade6-cf65905c7003" (UID: "4c0cdec2-be1f-4169-ade6-cf65905c7003"). InnerVolumeSpecName "kube-api-access-x4j2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.469191 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9" (OuterVolumeSpecName: "kube-api-access-ptzq9") pod "edca4de3-92c0-449a-a081-8868ada61ff8" (UID: "edca4de3-92c0-449a-a081-8868ada61ff8"). InnerVolumeSpecName "kube-api-access-ptzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564158 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4j2r\" (UniqueName: \"kubernetes.io/projected/4c0cdec2-be1f-4169-ade6-cf65905c7003-kube-api-access-x4j2r\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564194 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlrlw\" (UniqueName: \"kubernetes.io/projected/d206d6b1-be89-44b7-a4db-749bd0113be2-kube-api-access-qlrlw\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564204 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c0cdec2-be1f-4169-ade6-cf65905c7003-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564213 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d206d6b1-be89-44b7-a4db-749bd0113be2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564221 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptzq9\" (UniqueName: \"kubernetes.io/projected/edca4de3-92c0-449a-a081-8868ada61ff8-kube-api-access-ptzq9\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.564230 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edca4de3-92c0-449a-a081-8868ada61ff8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:11 crc kubenswrapper[5036]: I0110 16:43:11.937165 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.071505 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftj9k\" (UniqueName: \"kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k\") pod \"b720948d-ad09-4e83-8451-34e2b039f1d1\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.071585 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts\") pod \"b720948d-ad09-4e83-8451-34e2b039f1d1\" (UID: \"b720948d-ad09-4e83-8451-34e2b039f1d1\") " Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.072205 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b720948d-ad09-4e83-8451-34e2b039f1d1" (UID: "b720948d-ad09-4e83-8451-34e2b039f1d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.074836 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k" (OuterVolumeSpecName: "kube-api-access-ftj9k") pod "b720948d-ad09-4e83-8451-34e2b039f1d1" (UID: "b720948d-ad09-4e83-8451-34e2b039f1d1"). InnerVolumeSpecName "kube-api-access-ftj9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.173861 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftj9k\" (UniqueName: \"kubernetes.io/projected/b720948d-ad09-4e83-8451-34e2b039f1d1-kube-api-access-ftj9k\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.173921 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720948d-ad09-4e83-8451-34e2b039f1d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.207176 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-67ce-account-create-update-t7sxt" event={"ID":"4c0cdec2-be1f-4169-ade6-cf65905c7003","Type":"ContainerDied","Data":"41ccba2ff9cd28075be02266cbf8dffd90dd56c198696d2efae0d636b58a22cf"} Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.207225 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41ccba2ff9cd28075be02266cbf8dffd90dd56c198696d2efae0d636b58a22cf" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.207294 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-67ce-account-create-update-t7sxt" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.208819 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bzzkq" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.208818 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bzzkq" event={"ID":"d206d6b1-be89-44b7-a4db-749bd0113be2","Type":"ContainerDied","Data":"f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d"} Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.208960 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f3f3c91975a154781d427a77e663a90f9c5a31ee53f121172861142133d01d" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.210296 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-813f-account-create-update-m5fwq" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.210287 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6drbk" event={"ID":"b720948d-ad09-4e83-8451-34e2b039f1d1","Type":"ContainerDied","Data":"b3a943189613822ea98cbd70e82e4016f3c11e87ee96fdd14941fdd137d61e66"} Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.210443 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a943189613822ea98cbd70e82e4016f3c11e87ee96fdd14941fdd137d61e66" Jan 10 16:43:12 crc kubenswrapper[5036]: I0110 16:43:12.210591 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6drbk" Jan 10 16:43:14 crc kubenswrapper[5036]: I0110 16:43:14.442374 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6drbk"] Jan 10 16:43:14 crc kubenswrapper[5036]: I0110 16:43:14.450455 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6drbk"] Jan 10 16:43:14 crc kubenswrapper[5036]: I0110 16:43:14.538533 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b720948d-ad09-4e83-8451-34e2b039f1d1" path="/var/lib/kubelet/pods/b720948d-ad09-4e83-8451-34e2b039f1d1/volumes" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.264093 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerStarted","Data":"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299"} Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.264302 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.289315 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.750582614 podStartE2EDuration="1m5.289294607s" podCreationTimestamp="2026-01-10 16:42:11 +0000 UTC" firstStartedPulling="2026-01-10 16:42:21.01146181 +0000 UTC m=+862.881697304" lastFinishedPulling="2026-01-10 16:42:32.550173803 +0000 UTC m=+874.420409297" observedRunningTime="2026-01-10 16:43:16.285201571 +0000 UTC m=+918.155437085" watchObservedRunningTime="2026-01-10 16:43:16.289294607 +0000 UTC m=+918.159530101" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545223 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lj72s"] Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545778 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" 
containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545795 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545816 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0cdec2-be1f-4169-ade6-cf65905c7003" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545822 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0cdec2-be1f-4169-ade6-cf65905c7003" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545837 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0a63f9-1828-482a-a1f1-d99bf4b8932e" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545843 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0a63f9-1828-482a-a1f1-d99bf4b8932e" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545853 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edca4de3-92c0-449a-a081-8868ada61ff8" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545859 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="edca4de3-92c0-449a-a081-8868ada61ff8" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545871 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d206d6b1-be89-44b7-a4db-749bd0113be2" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545876 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d206d6b1-be89-44b7-a4db-749bd0113be2" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545891 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b720948d-ad09-4e83-8451-34e2b039f1d1" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545897 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b720948d-ad09-4e83-8451-34e2b039f1d1" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: E0110 16:43:16.545918 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b5d891-59ef-43ca-9689-ec2f4bfa590c" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.545924 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b5d891-59ef-43ca-9689-ec2f4bfa590c" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546050 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d206d6b1-be89-44b7-a4db-749bd0113be2" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546058 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0a63f9-1828-482a-a1f1-d99bf4b8932e" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546069 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0cdec2-be1f-4169-ade6-cf65905c7003" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546083 5036 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" containerName="mariadb-database-create" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546091 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b5d891-59ef-43ca-9689-ec2f4bfa590c" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546100 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b720948d-ad09-4e83-8451-34e2b039f1d1" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546109 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="edca4de3-92c0-449a-a081-8868ada61ff8" containerName="mariadb-account-create-update" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.546563 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.550951 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.555360 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kspfm" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.566808 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lj72s"] Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.669977 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.670061 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.670079 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.670246 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsg5n\" (UniqueName: \"kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.772090 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.772162 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.772199 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.772260 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsg5n\" (UniqueName: \"kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.778416 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.778632 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.779270 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.789922 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsg5n\" (UniqueName: \"kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n\") pod \"glance-db-sync-lj72s\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:16 crc kubenswrapper[5036]: I0110 16:43:16.865782 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lj72s" Jan 10 16:43:17 crc kubenswrapper[5036]: I0110 16:43:17.514802 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lj72s"] Jan 10 16:43:18 crc kubenswrapper[5036]: I0110 16:43:18.296339 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lj72s" event={"ID":"09a8e315-dd60-47a9-b03c-0897b6f21b3d","Type":"ContainerStarted","Data":"020c7634d681520529bea2eb7d907b5d0f5b6dc20ad78d6402d0a80072b7c3f6"} Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.442848 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5msnc"] Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.443824 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.451779 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5msnc"] Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.460774 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.577547 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.577722 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5n9t\" (UniqueName: \"kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.679209 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5n9t\" (UniqueName: \"kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.679265 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.680175 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.719011 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5n9t\" (UniqueName: \"kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t\") pod \"root-account-create-update-5msnc\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:19 crc kubenswrapper[5036]: I0110 16:43:19.761041 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:20 crc kubenswrapper[5036]: I0110 16:43:20.381934 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5msnc"] Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.315158 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-czqbw" podUID="be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4" containerName="ovn-controller" probeResult="failure" output=< Jan 10 16:43:21 crc kubenswrapper[5036]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 10 16:43:21 crc kubenswrapper[5036]: > Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.324783 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.363971 5036 generic.go:334] "Generic (PLEG): container finished" podID="d7c7eb4b-3f80-4f63-812b-7001e40c872f" containerID="51e7cb40a9e19b63c2f4d863ea9b6db94bb0c644585189fdf3fb938b9224b235" exitCode=0 Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.364100 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5msnc" event={"ID":"d7c7eb4b-3f80-4f63-812b-7001e40c872f","Type":"ContainerDied","Data":"51e7cb40a9e19b63c2f4d863ea9b6db94bb0c644585189fdf3fb938b9224b235"} Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.364229 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5msnc" event={"ID":"d7c7eb4b-3f80-4f63-812b-7001e40c872f","Type":"ContainerStarted","Data":"aaf4a19c8ad0d8ab656c3161ec50af80332fe632dd4a01a616e326333a3a1c8f"} Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.380565 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vsd6b" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.673216 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-czqbw-config-pmdmt"] Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.674771 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.677590 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.690392 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw-config-pmdmt"] Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869442 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869528 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869584 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869663 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869743 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.869767 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9dvg\" (UniqueName: \"kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971475 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971557 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn\") pod 
\"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971624 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971748 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971774 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9dvg\" (UniqueName: \"kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.971840 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.972205 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.972284 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.973250 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.973977 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:21 crc kubenswrapper[5036]: I0110 16:43:21.975142 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts\") pod 
\"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.000707 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9dvg\" (UniqueName: \"kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg\") pod \"ovn-controller-czqbw-config-pmdmt\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.292264 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.848094 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.890245 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw-config-pmdmt"] Jan 10 16:43:22 crc kubenswrapper[5036]: W0110 16:43:22.898263 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94923d86_2e86_4f62_bf40_b5c44fa2eaa2.slice/crio-7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6 WatchSource:0}: Error finding container 7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6: Status 404 returned error can't find the container with id 7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6 Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.985940 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts\") pod \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.986285 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5n9t\" (UniqueName: \"kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t\") pod \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\" (UID: \"d7c7eb4b-3f80-4f63-812b-7001e40c872f\") " Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.987040 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7c7eb4b-3f80-4f63-812b-7001e40c872f" (UID: "d7c7eb4b-3f80-4f63-812b-7001e40c872f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:22 crc kubenswrapper[5036]: I0110 16:43:22.992163 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t" (OuterVolumeSpecName: "kube-api-access-p5n9t") pod "d7c7eb4b-3f80-4f63-812b-7001e40c872f" (UID: "d7c7eb4b-3f80-4f63-812b-7001e40c872f"). InnerVolumeSpecName "kube-api-access-p5n9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.087628 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7c7eb4b-3f80-4f63-812b-7001e40c872f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.087665 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5n9t\" (UniqueName: \"kubernetes.io/projected/d7c7eb4b-3f80-4f63-812b-7001e40c872f-kube-api-access-p5n9t\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.205871 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.384026 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5msnc" event={"ID":"d7c7eb4b-3f80-4f63-812b-7001e40c872f","Type":"ContainerDied","Data":"aaf4a19c8ad0d8ab656c3161ec50af80332fe632dd4a01a616e326333a3a1c8f"} Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.384071 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf4a19c8ad0d8ab656c3161ec50af80332fe632dd4a01a616e326333a3a1c8f" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.384152 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5msnc" Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.388549 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-pmdmt" event={"ID":"94923d86-2e86-4f62-bf40-b5c44fa2eaa2","Type":"ContainerStarted","Data":"68ae9389e51072775c29153c00dd3c84a5516d103a50ab9e818f1ccdae8235a2"} Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.388591 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-pmdmt" event={"ID":"94923d86-2e86-4f62-bf40-b5c44fa2eaa2","Type":"ContainerStarted","Data":"7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6"} Jan 10 16:43:23 crc kubenswrapper[5036]: I0110 16:43:23.888635 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-czqbw-config-pmdmt" podStartSLOduration=2.888614791 podStartE2EDuration="2.888614791s" podCreationTimestamp="2026-01-10 16:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:23.422471126 +0000 UTC m=+925.292706630" watchObservedRunningTime="2026-01-10 16:43:23.888614791 +0000 UTC m=+925.758850285" Jan 10 16:43:24 crc kubenswrapper[5036]: I0110 16:43:24.397444 5036 generic.go:334] "Generic (PLEG): container finished" podID="94923d86-2e86-4f62-bf40-b5c44fa2eaa2" containerID="68ae9389e51072775c29153c00dd3c84a5516d103a50ab9e818f1ccdae8235a2" exitCode=0 Jan 10 16:43:24 crc kubenswrapper[5036]: I0110 16:43:24.397492 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-pmdmt" event={"ID":"94923d86-2e86-4f62-bf40-b5c44fa2eaa2","Type":"ContainerDied","Data":"68ae9389e51072775c29153c00dd3c84a5516d103a50ab9e818f1ccdae8235a2"} Jan 10 16:43:26 crc kubenswrapper[5036]: I0110 16:43:26.355188 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-czqbw" Jan 10 16:43:32 crc kubenswrapper[5036]: E0110 16:43:32.664774 5036 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Jan 10 16:43:32 crc kubenswrapper[5036]: E0110 16:43:32.665519 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsg5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-lj72s_openstack(09a8e315-dd60-47a9-b03c-0897b6f21b3d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:43:32 crc kubenswrapper[5036]: E0110 16:43:32.666649 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-lj72s" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.675208 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.775894 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820280 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820414 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820529 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820575 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820617 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820669 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9dvg\" (UniqueName: \"kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg\") pod \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\" (UID: \"94923d86-2e86-4f62-bf40-b5c44fa2eaa2\") " Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.820410 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821064 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run" (OuterVolumeSpecName: "var-run") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821199 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821295 5036 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821395 5036 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821692 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts" (OuterVolumeSpecName: "scripts") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.821981 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.826762 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg" (OuterVolumeSpecName: "kube-api-access-r9dvg") pod "94923d86-2e86-4f62-bf40-b5c44fa2eaa2" (UID: "94923d86-2e86-4f62-bf40-b5c44fa2eaa2"). InnerVolumeSpecName "kube-api-access-r9dvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.923180 5036 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.923213 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.923223 5036 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:32 crc kubenswrapper[5036]: I0110 16:43:32.923233 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9dvg\" (UniqueName: \"kubernetes.io/projected/94923d86-2e86-4f62-bf40-b5c44fa2eaa2-kube-api-access-r9dvg\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.052190 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kmlkm"] Jan 10 16:43:33 crc kubenswrapper[5036]: E0110 16:43:33.052584 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c7eb4b-3f80-4f63-812b-7001e40c872f" containerName="mariadb-account-create-update" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.052622 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c7eb4b-3f80-4f63-812b-7001e40c872f" containerName="mariadb-account-create-update" Jan 10 16:43:33 crc kubenswrapper[5036]: E0110 16:43:33.052639 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94923d86-2e86-4f62-bf40-b5c44fa2eaa2" containerName="ovn-config" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.052648 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="94923d86-2e86-4f62-bf40-b5c44fa2eaa2" containerName="ovn-config" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.052837 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c7eb4b-3f80-4f63-812b-7001e40c872f" containerName="mariadb-account-create-update" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.052867 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="94923d86-2e86-4f62-bf40-b5c44fa2eaa2" containerName="ovn-config" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.053474 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.073815 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kmlkm"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.141641 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ls5rk"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.142716 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.149931 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ls5rk"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.227437 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.227645 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v86zf\" (UniqueName: \"kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.255404 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-468c-account-create-update-2r8fm"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.256472 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.259343 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.274498 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-468c-account-create-update-2r8fm"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.328975 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v86zf\" (UniqueName: \"kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.329022 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtp7k\" (UniqueName: \"kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.329081 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.329120 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.329933 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.354902 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-lk484"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.356065 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.357276 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v86zf\" (UniqueName: \"kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf\") pod \"cinder-db-create-kmlkm\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.366301 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-21a4-account-create-update-qpvgs"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.367515 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.371052 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.395400 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.398160 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lk484"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.417317 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-21a4-account-create-update-qpvgs"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.430779 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.430889 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtp7k\" (UniqueName: \"kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.430925 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.430962 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdlx\" (UniqueName: \"kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: 
\"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.431704 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.462324 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtp7k\" (UniqueName: \"kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k\") pod \"barbican-db-create-ls5rk\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.508167 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qpfq8"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.509484 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.512017 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.512185 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n6pzn" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.512195 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.512321 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.521037 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qpfq8"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.531935 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532004 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdlx\" (UniqueName: \"kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532035 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpg6l\" (UniqueName: \"kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532074 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsbjn\" (UniqueName: 
\"kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532119 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532153 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.532950 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.559328 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdlx\" (UniqueName: \"kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx\") pod \"barbican-468c-account-create-update-2r8fm\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.565853 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c5a3-account-create-update-htbm6"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.567189 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.572459 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.575186 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5a3-account-create-update-htbm6"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.579486 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.619066 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-czqbw-config-pmdmt" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.623504 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-pmdmt" event={"ID":"94923d86-2e86-4f62-bf40-b5c44fa2eaa2","Type":"ContainerDied","Data":"7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6"} Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.623543 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7234b250192602c44668e45217d85e8e2ba97039246405405db67dc63b164fa6" Jan 10 16:43:33 crc kubenswrapper[5036]: E0110 16:43:33.626189 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-lj72s" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634410 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpg6l\" (UniqueName: \"kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634480 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsbjn\" (UniqueName: \"kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634510 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634556 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4pvq\" (UniqueName: \"kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634575 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634623 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.634662 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.636247 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.636891 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.656534 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpg6l\" (UniqueName: \"kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l\") pod \"neutron-db-create-lk484\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.657064 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsbjn\" (UniqueName: \"kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn\") pod \"cinder-21a4-account-create-update-qpvgs\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.708573 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lk484" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.716536 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.736612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.736667 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.736835 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4pvq\" (UniqueName: \"kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.736943 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.737012 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpbll\" (UniqueName: \"kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.741329 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.742622 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.760094 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.980938 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.981392 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpbll\" (UniqueName: \"kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.981740 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.982199 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4pvq\" (UniqueName: \"kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq\") pod \"keystone-db-sync-qpfq8\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.989349 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-czqbw-config-pmdmt"] Jan 10 16:43:33 crc kubenswrapper[5036]: I0110 16:43:33.996632 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-czqbw-config-pmdmt"] Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.012925 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpbll\" (UniqueName: \"kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll\") pod \"neutron-c5a3-account-create-update-htbm6\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.061268 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-czqbw-config-zpxp4"] Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.063540 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.067888 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.069862 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw-config-zpxp4"] Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.082855 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.082898 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zssqf\" (UniqueName: \"kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.082922 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.082953 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.083008 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.083034 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.137444 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183697 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183790 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183823 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zssqf\" (UniqueName: \"kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183845 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183863 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.183914 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.184835 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.185031 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.187142 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " 
pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.187414 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.187452 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.211782 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.216964 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zssqf\" (UniqueName: \"kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf\") pod \"ovn-controller-czqbw-config-zpxp4\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.390924 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.438814 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kmlkm"] Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.525436 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94923d86-2e86-4f62-bf40-b5c44fa2eaa2" path="/var/lib/kubelet/pods/94923d86-2e86-4f62-bf40-b5c44fa2eaa2/volumes" Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.627944 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kmlkm" event={"ID":"436d1751-fb2f-45ab-a1c8-a64e3f8b628f","Type":"ContainerStarted","Data":"d252b188eda7df7d63a557c13317e736af1dd15eed1efc4132514748c8d2fafa"} Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.672983 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-468c-account-create-update-2r8fm"] Jan 10 16:43:34 crc kubenswrapper[5036]: I0110 16:43:34.769203 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-lk484"] Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.180025 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ls5rk"] Jan 10 16:43:35 crc kubenswrapper[5036]: W0110 16:43:35.185923 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd623293c_52c5_4236_9a80_1ac9af4517d4.slice/crio-db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a WatchSource:0}: Error finding container db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a: Status 404 returned error can't find the container with id db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.229596 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qpfq8"] Jan 10 16:43:35 crc kubenswrapper[5036]: 
I0110 16:43:35.257453 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-21a4-account-create-update-qpvgs"] Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.265394 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c5a3-account-create-update-htbm6"] Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.457306 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-czqbw-config-zpxp4"] Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.650572 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ls5rk" event={"ID":"9d244a3f-4202-469c-a576-b14fb7323180","Type":"ContainerStarted","Data":"e5444e16dd10b769b23e1520919e4d90fbef589ff09bdcffc5a4fd3a393bf19c"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.651057 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ls5rk" event={"ID":"9d244a3f-4202-469c-a576-b14fb7323180","Type":"ContainerStarted","Data":"3be5684e79f7d3094c3733e4523c4c07b120d891c02da57e9f9f0f5ea59d26ff"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.657764 5036 generic.go:334] "Generic (PLEG): container finished" podID="436d1751-fb2f-45ab-a1c8-a64e3f8b628f" containerID="c008e0fa37fd0ebe2bd5f950ad76e574f7be09aea9ba2c7447cb36a55bec9f8c" exitCode=0 Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.657892 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kmlkm" event={"ID":"436d1751-fb2f-45ab-a1c8-a64e3f8b628f","Type":"ContainerDied","Data":"c008e0fa37fd0ebe2bd5f950ad76e574f7be09aea9ba2c7447cb36a55bec9f8c"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.660228 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qpfq8" event={"ID":"d623293c-52c5-4236-9a80-1ac9af4517d4","Type":"ContainerStarted","Data":"db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.663278 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-21a4-account-create-update-qpvgs" event={"ID":"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c","Type":"ContainerStarted","Data":"79c3e8969c9f9b65d130f2990e182a31d8a891e52d1cfc216eca6a03eec32628"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.663310 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-21a4-account-create-update-qpvgs" event={"ID":"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c","Type":"ContainerStarted","Data":"ee3225ebf4c21b5ba0ebaeb0e0740240c97e1c68a45d4eaa6ff8bf57051b122d"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.670808 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-ls5rk" podStartSLOduration=2.670791656 podStartE2EDuration="2.670791656s" podCreationTimestamp="2026-01-10 16:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:35.668215353 +0000 UTC m=+937.538450847" watchObservedRunningTime="2026-01-10 16:43:35.670791656 +0000 UTC m=+937.541027150" Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.671103 5036 generic.go:334] "Generic (PLEG): container finished" podID="97027cc1-5cae-4bbf-8b11-5f7103ba4f09" containerID="9a4748ce01c603963f1cf607735e45a8b5c0430ba32a933bbc66d44ecda48ff5" exitCode=0 Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.671243 5036 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-468c-account-create-update-2r8fm" event={"ID":"97027cc1-5cae-4bbf-8b11-5f7103ba4f09","Type":"ContainerDied","Data":"9a4748ce01c603963f1cf607735e45a8b5c0430ba32a933bbc66d44ecda48ff5"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.671277 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-468c-account-create-update-2r8fm" event={"ID":"97027cc1-5cae-4bbf-8b11-5f7103ba4f09","Type":"ContainerStarted","Data":"102f555f04fb7514be8747e657eee81e817459f4cc8bc1337edf724271ec258d"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.672454 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5a3-account-create-update-htbm6" event={"ID":"199f55a8-575e-4b45-add1-ed5d4da32d21","Type":"ContainerStarted","Data":"0c48c8c0e73c0ae93f5400b3f232a9ca88a7f2457336f8f5922b0151b1bc32fc"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.672484 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5a3-account-create-update-htbm6" event={"ID":"199f55a8-575e-4b45-add1-ed5d4da32d21","Type":"ContainerStarted","Data":"a46f3aacf62988fb2ba4721ca2c81c514a54a28b8ce7de540bbd15b1ac04c9fd"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.675605 5036 generic.go:334] "Generic (PLEG): container finished" podID="9374853e-04a4-4903-877b-f725f5066bfc" containerID="a9b0fcb41924f931ce7ff7ff245f38c13cdd3e86e0a6f8203641370276edb5bc" exitCode=0 Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.675653 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lk484" event={"ID":"9374853e-04a4-4903-877b-f725f5066bfc","Type":"ContainerDied","Data":"a9b0fcb41924f931ce7ff7ff245f38c13cdd3e86e0a6f8203641370276edb5bc"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.675696 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lk484" event={"ID":"9374853e-04a4-4903-877b-f725f5066bfc","Type":"ContainerStarted","Data":"fce35448985860a7f4d8219b0fd4c4a7b0f471a53b27c339e4ef83bdb9a1bb58"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.676933 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-zpxp4" event={"ID":"988e3b90-6f2a-48fc-87ed-858c7643980e","Type":"ContainerStarted","Data":"1710748d280d929408ed1c64ceccd0cf8dfc234e767a3a83b07e65f2caf0fdda"} Jan 10 16:43:35 crc kubenswrapper[5036]: I0110 16:43:35.689557 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-21a4-account-create-update-qpvgs" podStartSLOduration=2.689543678 podStartE2EDuration="2.689543678s" podCreationTimestamp="2026-01-10 16:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:35.686868062 +0000 UTC m=+937.557103556" watchObservedRunningTime="2026-01-10 16:43:35.689543678 +0000 UTC m=+937.559779172" Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.695738 5036 generic.go:334] "Generic (PLEG): container finished" podID="9d244a3f-4202-469c-a576-b14fb7323180" containerID="e5444e16dd10b769b23e1520919e4d90fbef589ff09bdcffc5a4fd3a393bf19c" exitCode=0 Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.695977 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ls5rk" event={"ID":"9d244a3f-4202-469c-a576-b14fb7323180","Type":"ContainerDied","Data":"e5444e16dd10b769b23e1520919e4d90fbef589ff09bdcffc5a4fd3a393bf19c"} Jan 10 16:43:36 crc 
kubenswrapper[5036]: I0110 16:43:36.699085 5036 generic.go:334] "Generic (PLEG): container finished" podID="cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" containerID="79c3e8969c9f9b65d130f2990e182a31d8a891e52d1cfc216eca6a03eec32628" exitCode=0 Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.699244 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-21a4-account-create-update-qpvgs" event={"ID":"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c","Type":"ContainerDied","Data":"79c3e8969c9f9b65d130f2990e182a31d8a891e52d1cfc216eca6a03eec32628"} Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.700319 5036 generic.go:334] "Generic (PLEG): container finished" podID="199f55a8-575e-4b45-add1-ed5d4da32d21" containerID="0c48c8c0e73c0ae93f5400b3f232a9ca88a7f2457336f8f5922b0151b1bc32fc" exitCode=0 Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.700526 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5a3-account-create-update-htbm6" event={"ID":"199f55a8-575e-4b45-add1-ed5d4da32d21","Type":"ContainerDied","Data":"0c48c8c0e73c0ae93f5400b3f232a9ca88a7f2457336f8f5922b0151b1bc32fc"} Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.701874 5036 generic.go:334] "Generic (PLEG): container finished" podID="988e3b90-6f2a-48fc-87ed-858c7643980e" containerID="cdf2698d6d411afc14345f2ad6de4d7166f342a99c8f717405de6bc8e2679a44" exitCode=0 Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.702127 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-zpxp4" event={"ID":"988e3b90-6f2a-48fc-87ed-858c7643980e","Type":"ContainerDied","Data":"cdf2698d6d411afc14345f2ad6de4d7166f342a99c8f717405de6bc8e2679a44"} Jan 10 16:43:36 crc kubenswrapper[5036]: I0110 16:43:36.713791 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c5a3-account-create-update-htbm6" podStartSLOduration=3.713775304 podStartE2EDuration="3.713775304s" podCreationTimestamp="2026-01-10 16:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:35.755009302 +0000 UTC m=+937.625244796" watchObservedRunningTime="2026-01-10 16:43:36.713775304 +0000 UTC m=+938.584010798" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.134907 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.147969 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lk484" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.171118 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.178978 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts\") pod \"9374853e-04a4-4903-877b-f725f5066bfc\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.179068 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpg6l\" (UniqueName: \"kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l\") pod \"9374853e-04a4-4903-877b-f725f5066bfc\" (UID: \"9374853e-04a4-4903-877b-f725f5066bfc\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.179108 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts\") pod \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.179475 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v86zf\" (UniqueName: \"kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf\") pod \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\" (UID: \"436d1751-fb2f-45ab-a1c8-a64e3f8b628f\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.179910 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9374853e-04a4-4903-877b-f725f5066bfc" (UID: "9374853e-04a4-4903-877b-f725f5066bfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.180140 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9374853e-04a4-4903-877b-f725f5066bfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.180274 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "436d1751-fb2f-45ab-a1c8-a64e3f8b628f" (UID: "436d1751-fb2f-45ab-a1c8-a64e3f8b628f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.190828 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l" (OuterVolumeSpecName: "kube-api-access-qpg6l") pod "9374853e-04a4-4903-877b-f725f5066bfc" (UID: "9374853e-04a4-4903-877b-f725f5066bfc"). InnerVolumeSpecName "kube-api-access-qpg6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.193030 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf" (OuterVolumeSpecName: "kube-api-access-v86zf") pod "436d1751-fb2f-45ab-a1c8-a64e3f8b628f" (UID: "436d1751-fb2f-45ab-a1c8-a64e3f8b628f"). 
InnerVolumeSpecName "kube-api-access-v86zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.281452 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdlx\" (UniqueName: \"kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx\") pod \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.282089 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts\") pod \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\" (UID: \"97027cc1-5cae-4bbf-8b11-5f7103ba4f09\") " Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.282588 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97027cc1-5cae-4bbf-8b11-5f7103ba4f09" (UID: "97027cc1-5cae-4bbf-8b11-5f7103ba4f09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.282631 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v86zf\" (UniqueName: \"kubernetes.io/projected/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-kube-api-access-v86zf\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.282688 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpg6l\" (UniqueName: \"kubernetes.io/projected/9374853e-04a4-4903-877b-f725f5066bfc-kube-api-access-qpg6l\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.282704 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/436d1751-fb2f-45ab-a1c8-a64e3f8b628f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.285847 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx" (OuterVolumeSpecName: "kube-api-access-tmdlx") pod "97027cc1-5cae-4bbf-8b11-5f7103ba4f09" (UID: "97027cc1-5cae-4bbf-8b11-5f7103ba4f09"). InnerVolumeSpecName "kube-api-access-tmdlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.384292 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmdlx\" (UniqueName: \"kubernetes.io/projected/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-kube-api-access-tmdlx\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.384341 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97027cc1-5cae-4bbf-8b11-5f7103ba4f09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.714289 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kmlkm" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.714321 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kmlkm" event={"ID":"436d1751-fb2f-45ab-a1c8-a64e3f8b628f","Type":"ContainerDied","Data":"d252b188eda7df7d63a557c13317e736af1dd15eed1efc4132514748c8d2fafa"} Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.714363 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d252b188eda7df7d63a557c13317e736af1dd15eed1efc4132514748c8d2fafa" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.718996 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-468c-account-create-update-2r8fm" event={"ID":"97027cc1-5cae-4bbf-8b11-5f7103ba4f09","Type":"ContainerDied","Data":"102f555f04fb7514be8747e657eee81e817459f4cc8bc1337edf724271ec258d"} Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.719019 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-468c-account-create-update-2r8fm" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.719037 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="102f555f04fb7514be8747e657eee81e817459f4cc8bc1337edf724271ec258d" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.721085 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-lk484" event={"ID":"9374853e-04a4-4903-877b-f725f5066bfc","Type":"ContainerDied","Data":"fce35448985860a7f4d8219b0fd4c4a7b0f471a53b27c339e4ef83bdb9a1bb58"} Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.721113 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce35448985860a7f4d8219b0fd4c4a7b0f471a53b27c339e4ef83bdb9a1bb58" Jan 10 16:43:37 crc kubenswrapper[5036]: I0110 16:43:37.721358 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-lk484" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:37.999832 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.097531 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsbjn\" (UniqueName: \"kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn\") pod \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.097650 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts\") pod \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\" (UID: \"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.098225 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" (UID: "cd7d3ebc-490f-4fbd-a86d-469b3c7f281c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.102912 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn" (OuterVolumeSpecName: "kube-api-access-qsbjn") pod "cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" (UID: "cd7d3ebc-490f-4fbd-a86d-469b3c7f281c"). InnerVolumeSpecName "kube-api-access-qsbjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.186161 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.201962 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsbjn\" (UniqueName: \"kubernetes.io/projected/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-kube-api-access-qsbjn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.202044 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.202817 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.214805 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303474 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303552 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts\") pod \"9d244a3f-4202-469c-a576-b14fb7323180\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303596 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303627 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts\") pod \"199f55a8-575e-4b45-add1-ed5d4da32d21\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303660 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtp7k\" (UniqueName: \"kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k\") pod \"9d244a3f-4202-469c-a576-b14fb7323180\" (UID: \"9d244a3f-4202-469c-a576-b14fb7323180\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303761 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303777 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303798 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303834 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zssqf\" (UniqueName: \"kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf\") pod \"988e3b90-6f2a-48fc-87ed-858c7643980e\" (UID: \"988e3b90-6f2a-48fc-87ed-858c7643980e\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.303903 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpbll\" (UniqueName: \"kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll\") pod \"199f55a8-575e-4b45-add1-ed5d4da32d21\" (UID: \"199f55a8-575e-4b45-add1-ed5d4da32d21\") " Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.304454 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "199f55a8-575e-4b45-add1-ed5d4da32d21" (UID: "199f55a8-575e-4b45-add1-ed5d4da32d21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.304543 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d244a3f-4202-469c-a576-b14fb7323180" (UID: "9d244a3f-4202-469c-a576-b14fb7323180"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.304593 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.304612 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run" (OuterVolumeSpecName: "var-run") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.304844 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.305033 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.305781 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts" (OuterVolumeSpecName: "scripts") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.308193 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll" (OuterVolumeSpecName: "kube-api-access-rpbll") pod "199f55a8-575e-4b45-add1-ed5d4da32d21" (UID: "199f55a8-575e-4b45-add1-ed5d4da32d21"). InnerVolumeSpecName "kube-api-access-rpbll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.309167 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k" (OuterVolumeSpecName: "kube-api-access-jtp7k") pod "9d244a3f-4202-469c-a576-b14fb7323180" (UID: "9d244a3f-4202-469c-a576-b14fb7323180"). InnerVolumeSpecName "kube-api-access-jtp7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.312256 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf" (OuterVolumeSpecName: "kube-api-access-zssqf") pod "988e3b90-6f2a-48fc-87ed-858c7643980e" (UID: "988e3b90-6f2a-48fc-87ed-858c7643980e"). InnerVolumeSpecName "kube-api-access-zssqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405886 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtp7k\" (UniqueName: \"kubernetes.io/projected/9d244a3f-4202-469c-a576-b14fb7323180-kube-api-access-jtp7k\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405939 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405948 5036 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405963 5036 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405972 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zssqf\" (UniqueName: \"kubernetes.io/projected/988e3b90-6f2a-48fc-87ed-858c7643980e-kube-api-access-zssqf\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.405988 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpbll\" (UniqueName: \"kubernetes.io/projected/199f55a8-575e-4b45-add1-ed5d4da32d21-kube-api-access-rpbll\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.406003 5036 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/988e3b90-6f2a-48fc-87ed-858c7643980e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.406013 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d244a3f-4202-469c-a576-b14fb7323180-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.406030 5036 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/988e3b90-6f2a-48fc-87ed-858c7643980e-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.406043 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/199f55a8-575e-4b45-add1-ed5d4da32d21-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.729376 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-21a4-account-create-update-qpvgs" event={"ID":"cd7d3ebc-490f-4fbd-a86d-469b3c7f281c","Type":"ContainerDied","Data":"ee3225ebf4c21b5ba0ebaeb0e0740240c97e1c68a45d4eaa6ff8bf57051b122d"} Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.729401 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-21a4-account-create-update-qpvgs" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.729419 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee3225ebf4c21b5ba0ebaeb0e0740240c97e1c68a45d4eaa6ff8bf57051b122d" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.731830 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c5a3-account-create-update-htbm6" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.731856 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c5a3-account-create-update-htbm6" event={"ID":"199f55a8-575e-4b45-add1-ed5d4da32d21","Type":"ContainerDied","Data":"a46f3aacf62988fb2ba4721ca2c81c514a54a28b8ce7de540bbd15b1ac04c9fd"} Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.731905 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46f3aacf62988fb2ba4721ca2c81c514a54a28b8ce7de540bbd15b1ac04c9fd" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.733779 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-czqbw-config-zpxp4" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.733785 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-czqbw-config-zpxp4" event={"ID":"988e3b90-6f2a-48fc-87ed-858c7643980e","Type":"ContainerDied","Data":"1710748d280d929408ed1c64ceccd0cf8dfc234e767a3a83b07e65f2caf0fdda"} Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.733813 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1710748d280d929408ed1c64ceccd0cf8dfc234e767a3a83b07e65f2caf0fdda" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.736616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ls5rk" event={"ID":"9d244a3f-4202-469c-a576-b14fb7323180","Type":"ContainerDied","Data":"3be5684e79f7d3094c3733e4523c4c07b120d891c02da57e9f9f0f5ea59d26ff"} Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.736632 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ls5rk" Jan 10 16:43:38 crc kubenswrapper[5036]: I0110 16:43:38.736644 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3be5684e79f7d3094c3733e4523c4c07b120d891c02da57e9f9f0f5ea59d26ff" Jan 10 16:43:39 crc kubenswrapper[5036]: I0110 16:43:39.341948 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-czqbw-config-zpxp4"] Jan 10 16:43:39 crc kubenswrapper[5036]: I0110 16:43:39.350845 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-czqbw-config-zpxp4"] Jan 10 16:43:40 crc kubenswrapper[5036]: I0110 16:43:40.536414 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988e3b90-6f2a-48fc-87ed-858c7643980e" path="/var/lib/kubelet/pods/988e3b90-6f2a-48fc-87ed-858c7643980e/volumes" Jan 10 16:43:43 crc kubenswrapper[5036]: I0110 16:43:43.787573 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qpfq8" event={"ID":"d623293c-52c5-4236-9a80-1ac9af4517d4","Type":"ContainerStarted","Data":"9d1d6f44d352d617ec142420b42426e14d335a6452b817af20cca30f4c6255b4"} Jan 10 16:43:43 crc kubenswrapper[5036]: I0110 16:43:43.813297 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qpfq8" podStartSLOduration=2.526113788 podStartE2EDuration="10.813276541s" podCreationTimestamp="2026-01-10 16:43:33 +0000 UTC" firstStartedPulling="2026-01-10 16:43:35.190562932 +0000 UTC m=+937.060798426" lastFinishedPulling="2026-01-10 16:43:43.477725685 +0000 UTC m=+945.347961179" observedRunningTime="2026-01-10 16:43:43.810192473 +0000 UTC m=+945.680427977" watchObservedRunningTime="2026-01-10 16:43:43.813276541 +0000 UTC m=+945.683512045" Jan 10 16:43:46 crc kubenswrapper[5036]: I0110 16:43:46.812974 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lj72s" event={"ID":"09a8e315-dd60-47a9-b03c-0897b6f21b3d","Type":"ContainerStarted","Data":"8d47ab53ff73ece2438b5a690dec63225c2d89b69b409be461593a11cd0e4a87"} Jan 10 16:43:46 crc kubenswrapper[5036]: I0110 16:43:46.816474 5036 generic.go:334] "Generic (PLEG): container finished" podID="d623293c-52c5-4236-9a80-1ac9af4517d4" containerID="9d1d6f44d352d617ec142420b42426e14d335a6452b817af20cca30f4c6255b4" exitCode=0 Jan 10 16:43:46 crc kubenswrapper[5036]: I0110 16:43:46.816549 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qpfq8" event={"ID":"d623293c-52c5-4236-9a80-1ac9af4517d4","Type":"ContainerDied","Data":"9d1d6f44d352d617ec142420b42426e14d335a6452b817af20cca30f4c6255b4"} Jan 10 16:43:46 crc kubenswrapper[5036]: I0110 16:43:46.837397 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lj72s" podStartSLOduration=2.312798082 podStartE2EDuration="30.837371562s" podCreationTimestamp="2026-01-10 16:43:16 +0000 UTC" firstStartedPulling="2026-01-10 16:43:17.522203915 +0000 UTC m=+919.392439409" lastFinishedPulling="2026-01-10 16:43:46.046777395 +0000 UTC m=+947.917012889" observedRunningTime="2026-01-10 16:43:46.832098173 +0000 UTC m=+948.702333687" watchObservedRunningTime="2026-01-10 16:43:46.837371562 +0000 UTC m=+948.707607086" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.105305 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.281126 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4pvq\" (UniqueName: \"kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq\") pod \"d623293c-52c5-4236-9a80-1ac9af4517d4\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.281190 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data\") pod \"d623293c-52c5-4236-9a80-1ac9af4517d4\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.281220 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle\") pod \"d623293c-52c5-4236-9a80-1ac9af4517d4\" (UID: \"d623293c-52c5-4236-9a80-1ac9af4517d4\") " Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.291959 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq" (OuterVolumeSpecName: "kube-api-access-h4pvq") pod "d623293c-52c5-4236-9a80-1ac9af4517d4" (UID: "d623293c-52c5-4236-9a80-1ac9af4517d4"). InnerVolumeSpecName "kube-api-access-h4pvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.305545 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d623293c-52c5-4236-9a80-1ac9af4517d4" (UID: "d623293c-52c5-4236-9a80-1ac9af4517d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.318977 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data" (OuterVolumeSpecName: "config-data") pod "d623293c-52c5-4236-9a80-1ac9af4517d4" (UID: "d623293c-52c5-4236-9a80-1ac9af4517d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.383038 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4pvq\" (UniqueName: \"kubernetes.io/projected/d623293c-52c5-4236-9a80-1ac9af4517d4-kube-api-access-h4pvq\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.383080 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.383093 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d623293c-52c5-4236-9a80-1ac9af4517d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.835883 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qpfq8" event={"ID":"d623293c-52c5-4236-9a80-1ac9af4517d4","Type":"ContainerDied","Data":"db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a"} Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.835930 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db77b1ba8a410654b1b01b39f2218422f4a8a745258e9e089eb4ac124f72d37a" Jan 10 16:43:48 crc kubenswrapper[5036]: I0110 16:43:48.836290 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qpfq8" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.124274 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125313 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988e3b90-6f2a-48fc-87ed-858c7643980e" containerName="ovn-config" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125333 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="988e3b90-6f2a-48fc-87ed-858c7643980e" containerName="ovn-config" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125346 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199f55a8-575e-4b45-add1-ed5d4da32d21" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125352 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="199f55a8-575e-4b45-add1-ed5d4da32d21" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125373 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d623293c-52c5-4236-9a80-1ac9af4517d4" containerName="keystone-db-sync" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125385 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d623293c-52c5-4236-9a80-1ac9af4517d4" containerName="keystone-db-sync" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125399 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9374853e-04a4-4903-877b-f725f5066bfc" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125408 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9374853e-04a4-4903-877b-f725f5066bfc" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125432 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d244a3f-4202-469c-a576-b14fb7323180" 
containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125439 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d244a3f-4202-469c-a576-b14fb7323180" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125446 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97027cc1-5cae-4bbf-8b11-5f7103ba4f09" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125452 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="97027cc1-5cae-4bbf-8b11-5f7103ba4f09" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125467 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436d1751-fb2f-45ab-a1c8-a64e3f8b628f" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125473 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="436d1751-fb2f-45ab-a1c8-a64e3f8b628f" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: E0110 16:43:49.125528 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125535 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125953 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d244a3f-4202-469c-a576-b14fb7323180" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125981 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="988e3b90-6f2a-48fc-87ed-858c7643980e" containerName="ovn-config" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.125990 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9374853e-04a4-4903-877b-f725f5066bfc" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.126006 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d623293c-52c5-4236-9a80-1ac9af4517d4" containerName="keystone-db-sync" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.126022 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="97027cc1-5cae-4bbf-8b11-5f7103ba4f09" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.126029 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.126040 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="199f55a8-575e-4b45-add1-ed5d4da32d21" containerName="mariadb-account-create-update" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.126052 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="436d1751-fb2f-45ab-a1c8-a64e3f8b628f" containerName="mariadb-database-create" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.128378 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.194211 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.199132 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.199180 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.199207 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.199270 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.199303 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xhn\" (UniqueName: \"kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.214254 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fz5f2"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.215598 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.226001 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.226360 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.226563 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.226742 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.229268 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fz5f2"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.229553 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n6pzn" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.302056 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.302144 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.302171 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xhn\" (UniqueName: \"kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.302223 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.302247 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.303182 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.304306 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.304821 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.307070 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.339511 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bks65"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.340549 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.342507 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bj2fv" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.342746 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.344569 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.352921 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xhn\" (UniqueName: \"kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn\") pod \"dnsmasq-dns-75bb4695fc-kdqqh\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.352983 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-j9crs"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.354000 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.360548 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6zp6p" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.360741 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.360840 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.368729 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j9crs"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.377897 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bks65"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404552 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404628 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pvn\" (UniqueName: \"kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404669 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404714 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404742 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.404792 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.462177 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.481749 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-c9rbf"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.482834 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.485476 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.485721 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q796t" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.492846 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.505900 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.505952 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.505982 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9pvn\" (UniqueName: \"kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506018 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506052 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506074 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvndh\" (UniqueName: \"kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506108 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts\") pod \"keystone-bootstrap-fz5f2\" (UID: 
\"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506135 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506173 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506204 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lt62\" (UniqueName: \"kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506241 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506269 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506290 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506328 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.506350 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.513525 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 
16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.517262 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.520909 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.526110 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.526775 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.533303 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c9rbf"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.550737 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9pvn\" (UniqueName: \"kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn\") pod \"keystone-bootstrap-fz5f2\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.561027 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wwspw"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.562065 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.565506 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.569845 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.570054 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zzv57" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.570168 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.600923 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wwspw"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608457 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608618 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608657 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608715 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvndh\" (UniqueName: \"kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608757 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608783 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608806 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lt62\" (UniqueName: \"kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608945 5036 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.608966 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.609012 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.609032 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.609057 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.615414 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.619320 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.622469 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.627689 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.652082 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 
16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.652580 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.657428 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.658396 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lt62\" (UniqueName: \"kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62\") pod \"cinder-db-sync-j9crs\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.659338 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvndh\" (UniqueName: \"kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.659916 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.660239 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle\") pod \"neutron-db-sync-bks65\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.692529 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.712109 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.712154 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.712162 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713503 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713558 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713605 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713627 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713649 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.713812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw8gn\" (UniqueName: \"kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.714760 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.718904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.719259 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.719625 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bks65" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.720126 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-j9crs" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.728772 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.729039 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.738391 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.754575 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk\") pod \"barbican-db-sync-c9rbf\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815426 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw8gn\" (UniqueName: \"kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815496 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815546 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815595 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815666 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815724 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815773 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815797 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815821 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815847 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815879 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815909 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815933 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdn5\" (UniqueName: \"kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815965 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.815990 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.816019 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prxr\" (UniqueName: 
\"kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.816046 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.821438 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.843192 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.845370 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.845647 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.849088 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw8gn\" (UniqueName: \"kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn\") pod \"placement-db-sync-wwspw\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.911478 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.917842 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.917912 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.917941 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdn5\" (UniqueName: \"kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.917969 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.917997 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918029 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prxr\" (UniqueName: \"kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918060 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918126 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918170 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918236 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918273 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918292 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.918868 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.921574 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.922125 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.922411 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.922452 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.922871 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.925602 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.934065 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.937241 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.946516 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.948110 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prxr\" (UniqueName: \"kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr\") pod \"dnsmasq-dns-745b9ddc8c-vrp42\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.949818 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdn5\" (UniqueName: \"kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5\") pod \"ceilometer-0\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " pod="openstack/ceilometer-0" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.961109 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wwspw" Jan 10 16:43:49 crc kubenswrapper[5036]: I0110 16:43:49.983519 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.063466 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.253794 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.473396 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fz5f2"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.777013 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-c9rbf"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.780072 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-j9crs"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.790143 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:43:50 crc kubenswrapper[5036]: W0110 16:43:50.827787 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1899d96_c3b2_415c_b1fd_7c2847da4370.slice/crio-5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce WatchSource:0}: Error finding container 5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce: Status 404 returned error can't find the container with id 5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.834982 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bks65"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.863741 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wwspw"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.872942 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.897953 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9crs" event={"ID":"6fe6dd46-603d-4595-ad27-32f98623fbcc","Type":"ContainerStarted","Data":"4686c23e6ac940c49e70fd5772cc4abb964c2a93cf919775231b8ac1c5e53870"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.904222 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerStarted","Data":"04924f97357adea774932af9153dad042bc0009a7d2cfe2cd95279c95090affa"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.908397 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wwspw" event={"ID":"7b4df096-e6ee-47df-a4ae-035aeade27a6","Type":"ContainerStarted","Data":"328c9ee06832297cb6657c7218bd7cc85970b8be63ed6a69e57cf9d9abdae835"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.910993 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c9rbf" event={"ID":"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c","Type":"ContainerStarted","Data":"c0fd535b2a0707eb2f6c0883417a35643945311a7657e29f12851a1cb8c15df6"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.912042 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bks65" event={"ID":"b1899d96-c3b2-415c-b1fd-7c2847da4370","Type":"ContainerStarted","Data":"5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.915670 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 
16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.928042 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz5f2" event={"ID":"820bd3cc-21aa-4dce-b73f-d2310f0a436a","Type":"ContainerStarted","Data":"9b50602cb81b4a10f23f70316e09fc6975bf9fbf04c93e6d7e6395516dea43f5"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.928104 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz5f2" event={"ID":"820bd3cc-21aa-4dce-b73f-d2310f0a436a","Type":"ContainerStarted","Data":"293df9e509e10602c96a054c3840be32f8dce598234f8a14ef8607bca4ef88ee"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.940828 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" event={"ID":"7789491d-15dc-44e0-88d9-141ba4009010","Type":"ContainerStarted","Data":"43c23df4c28a7f82a7a7e9ceea55f0dc6084cc1ac9e90172ab95e5de6224f1dd"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.947542 5036 generic.go:334] "Generic (PLEG): container finished" podID="f8da4ffe-27e8-460a-9639-1da9afda0a2d" containerID="ac9d77b4de35dd29a86f95706d5e670a78af7a78e832a0cf5e3bb89dc4150e3d" exitCode=0 Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.947581 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" event={"ID":"f8da4ffe-27e8-460a-9639-1da9afda0a2d","Type":"ContainerDied","Data":"ac9d77b4de35dd29a86f95706d5e670a78af7a78e832a0cf5e3bb89dc4150e3d"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.947607 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" event={"ID":"f8da4ffe-27e8-460a-9639-1da9afda0a2d","Type":"ContainerStarted","Data":"3b5bda52eab3d6721dd547523565e65848cbb525fff56d1ddd34a45c22ce0fd5"} Jan 10 16:43:50 crc kubenswrapper[5036]: I0110 16:43:50.957533 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fz5f2" podStartSLOduration=1.957437672 podStartE2EDuration="1.957437672s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:50.948429427 +0000 UTC m=+952.818664921" watchObservedRunningTime="2026-01-10 16:43:50.957437672 +0000 UTC m=+952.827673166" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.237330 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.353476 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb\") pod \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.353955 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config\") pod \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.354011 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb\") pod \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.354041 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc\") pod \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.354066 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2xhn\" (UniqueName: \"kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn\") pod \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\" (UID: \"f8da4ffe-27e8-460a-9639-1da9afda0a2d\") " Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.360845 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn" (OuterVolumeSpecName: "kube-api-access-c2xhn") pod "f8da4ffe-27e8-460a-9639-1da9afda0a2d" (UID: "f8da4ffe-27e8-460a-9639-1da9afda0a2d"). InnerVolumeSpecName "kube-api-access-c2xhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.377937 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config" (OuterVolumeSpecName: "config") pod "f8da4ffe-27e8-460a-9639-1da9afda0a2d" (UID: "f8da4ffe-27e8-460a-9639-1da9afda0a2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.377981 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f8da4ffe-27e8-460a-9639-1da9afda0a2d" (UID: "f8da4ffe-27e8-460a-9639-1da9afda0a2d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.378437 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f8da4ffe-27e8-460a-9639-1da9afda0a2d" (UID: "f8da4ffe-27e8-460a-9639-1da9afda0a2d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.378760 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f8da4ffe-27e8-460a-9639-1da9afda0a2d" (UID: "f8da4ffe-27e8-460a-9639-1da9afda0a2d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.456132 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.456167 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.456178 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.456187 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2xhn\" (UniqueName: \"kubernetes.io/projected/f8da4ffe-27e8-460a-9639-1da9afda0a2d-kube-api-access-c2xhn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.456196 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f8da4ffe-27e8-460a-9639-1da9afda0a2d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.959464 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bks65" event={"ID":"b1899d96-c3b2-415c-b1fd-7c2847da4370","Type":"ContainerStarted","Data":"5aa7d4af41e3913267cb29417492d0762ad6c746ab2547a867c7a0c593e6b8a1"} Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.970500 5036 generic.go:334] "Generic (PLEG): container finished" podID="7789491d-15dc-44e0-88d9-141ba4009010" containerID="7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8" exitCode=0 Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.970562 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" event={"ID":"7789491d-15dc-44e0-88d9-141ba4009010","Type":"ContainerDied","Data":"7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8"} Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.982947 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" event={"ID":"f8da4ffe-27e8-460a-9639-1da9afda0a2d","Type":"ContainerDied","Data":"3b5bda52eab3d6721dd547523565e65848cbb525fff56d1ddd34a45c22ce0fd5"} Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.982997 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-kdqqh" Jan 10 16:43:51 crc kubenswrapper[5036]: I0110 16:43:51.983008 5036 scope.go:117] "RemoveContainer" containerID="ac9d77b4de35dd29a86f95706d5e670a78af7a78e832a0cf5e3bb89dc4150e3d" Jan 10 16:43:52 crc kubenswrapper[5036]: I0110 16:43:52.020380 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bks65" podStartSLOduration=3.020357424 podStartE2EDuration="3.020357424s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:51.979101445 +0000 UTC m=+953.849336959" watchObservedRunningTime="2026-01-10 16:43:52.020357424 +0000 UTC m=+953.890592928" Jan 10 16:43:52 crc kubenswrapper[5036]: I0110 16:43:52.180241 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:52 crc kubenswrapper[5036]: I0110 16:43:52.187925 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-kdqqh"] Jan 10 16:43:52 crc kubenswrapper[5036]: I0110 16:43:52.521031 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8da4ffe-27e8-460a-9639-1da9afda0a2d" path="/var/lib/kubelet/pods/f8da4ffe-27e8-460a-9639-1da9afda0a2d/volumes" Jan 10 16:43:52 crc kubenswrapper[5036]: I0110 16:43:52.996146 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" event={"ID":"7789491d-15dc-44e0-88d9-141ba4009010","Type":"ContainerStarted","Data":"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4"} Jan 10 16:43:53 crc kubenswrapper[5036]: I0110 16:43:53.065497 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" podStartSLOduration=4.065474722 podStartE2EDuration="4.065474722s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:43:53.049769647 +0000 UTC m=+954.920005161" watchObservedRunningTime="2026-01-10 16:43:53.065474722 +0000 UTC m=+954.935710256" Jan 10 16:43:54 crc kubenswrapper[5036]: I0110 16:43:54.076843 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:43:56 crc kubenswrapper[5036]: I0110 16:43:56.109861 5036 generic.go:334] "Generic (PLEG): container finished" podID="820bd3cc-21aa-4dce-b73f-d2310f0a436a" containerID="9b50602cb81b4a10f23f70316e09fc6975bf9fbf04c93e6d7e6395516dea43f5" exitCode=0 Jan 10 16:43:56 crc kubenswrapper[5036]: I0110 16:43:56.109946 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz5f2" event={"ID":"820bd3cc-21aa-4dce-b73f-d2310f0a436a","Type":"ContainerDied","Data":"9b50602cb81b4a10f23f70316e09fc6975bf9fbf04c93e6d7e6395516dea43f5"} Jan 10 16:43:57 crc kubenswrapper[5036]: I0110 16:43:57.954149 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032180 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032221 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032305 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032327 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032373 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.032402 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9pvn\" (UniqueName: \"kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn\") pod \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\" (UID: \"820bd3cc-21aa-4dce-b73f-d2310f0a436a\") " Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.038279 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn" (OuterVolumeSpecName: "kube-api-access-j9pvn") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "kube-api-access-j9pvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.038403 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.039457 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts" (OuterVolumeSpecName: "scripts") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.039555 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.058445 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data" (OuterVolumeSpecName: "config-data") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.058552 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820bd3cc-21aa-4dce-b73f-d2310f0a436a" (UID: "820bd3cc-21aa-4dce-b73f-d2310f0a436a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133386 5036 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133414 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133424 5036 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133432 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133440 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820bd3cc-21aa-4dce-b73f-d2310f0a436a-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.133448 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9pvn\" (UniqueName: \"kubernetes.io/projected/820bd3cc-21aa-4dce-b73f-d2310f0a436a-kube-api-access-j9pvn\") on node \"crc\" DevicePath \"\"" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.194440 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fz5f2"] Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.195964 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz5f2" event={"ID":"820bd3cc-21aa-4dce-b73f-d2310f0a436a","Type":"ContainerDied","Data":"293df9e509e10602c96a054c3840be32f8dce598234f8a14ef8607bca4ef88ee"} Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.195997 5036 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="293df9e509e10602c96a054c3840be32f8dce598234f8a14ef8607bca4ef88ee" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.196006 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fz5f2" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.199176 5036 generic.go:334] "Generic (PLEG): container finished" podID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" containerID="8d47ab53ff73ece2438b5a690dec63225c2d89b69b409be461593a11cd0e4a87" exitCode=0 Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.199208 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lj72s" event={"ID":"09a8e315-dd60-47a9-b03c-0897b6f21b3d","Type":"ContainerDied","Data":"8d47ab53ff73ece2438b5a690dec63225c2d89b69b409be461593a11cd0e4a87"} Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.203652 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fz5f2"] Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.300757 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cngfk"] Jan 10 16:43:58 crc kubenswrapper[5036]: E0110 16:43:58.301322 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8da4ffe-27e8-460a-9639-1da9afda0a2d" containerName="init" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.301343 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8da4ffe-27e8-460a-9639-1da9afda0a2d" containerName="init" Jan 10 16:43:58 crc kubenswrapper[5036]: E0110 16:43:58.301365 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820bd3cc-21aa-4dce-b73f-d2310f0a436a" containerName="keystone-bootstrap" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.301378 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="820bd3cc-21aa-4dce-b73f-d2310f0a436a" containerName="keystone-bootstrap" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.301663 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="820bd3cc-21aa-4dce-b73f-d2310f0a436a" containerName="keystone-bootstrap" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.301782 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8da4ffe-27e8-460a-9639-1da9afda0a2d" containerName="init" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.302633 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.305912 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n6pzn" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.306168 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.306706 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.308360 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.308413 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cngfk"] Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.309536 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.336765 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.336853 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.336883 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.336945 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.337013 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.337074 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5qc\" (UniqueName: \"kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.437760 5036 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.438000 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5qc\" (UniqueName: \"kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.438054 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.438105 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.438131 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.438156 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.441861 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.441888 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.442379 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.446763 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.451314 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.459994 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.470807 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.471248 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.478336 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5qc\" (UniqueName: \"kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc\") pod \"keystone-bootstrap-cngfk\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.525709 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820bd3cc-21aa-4dce-b73f-d2310f0a436a" path="/var/lib/kubelet/pods/820bd3cc-21aa-4dce-b73f-d2310f0a436a/volumes" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.626972 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n6pzn" Jan 10 16:43:58 crc kubenswrapper[5036]: I0110 16:43:58.635723 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:43:59 crc kubenswrapper[5036]: I0110 16:43:59.986197 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:44:00 crc kubenswrapper[5036]: I0110 16:44:00.045788 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:44:00 crc kubenswrapper[5036]: I0110 16:44:00.046037 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" containerID="cri-o://cc34ed8a49035e5d47d4f6cddd767ffa78c14186e029a057944a0517ff7bd9d7" gracePeriod=10 Jan 10 16:44:01 crc kubenswrapper[5036]: I0110 16:44:01.276444 5036 generic.go:334] "Generic (PLEG): container finished" podID="ba71d9fa-3872-4975-8f46-767f96064411" containerID="cc34ed8a49035e5d47d4f6cddd767ffa78c14186e029a057944a0517ff7bd9d7" exitCode=0 Jan 10 16:44:01 crc kubenswrapper[5036]: I0110 16:44:01.276530 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" event={"ID":"ba71d9fa-3872-4975-8f46-767f96064411","Type":"ContainerDied","Data":"cc34ed8a49035e5d47d4f6cddd767ffa78c14186e029a057944a0517ff7bd9d7"} Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.466699 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lj72s" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.645545 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data\") pod \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.645637 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle\") pod \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.645680 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data\") pod \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.645879 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsg5n\" (UniqueName: \"kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n\") pod \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\" (UID: \"09a8e315-dd60-47a9-b03c-0897b6f21b3d\") " Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.652720 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n" (OuterVolumeSpecName: "kube-api-access-qsg5n") pod "09a8e315-dd60-47a9-b03c-0897b6f21b3d" (UID: "09a8e315-dd60-47a9-b03c-0897b6f21b3d"). InnerVolumeSpecName "kube-api-access-qsg5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.674371 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "09a8e315-dd60-47a9-b03c-0897b6f21b3d" (UID: "09a8e315-dd60-47a9-b03c-0897b6f21b3d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.675887 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a8e315-dd60-47a9-b03c-0897b6f21b3d" (UID: "09a8e315-dd60-47a9-b03c-0897b6f21b3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.696832 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data" (OuterVolumeSpecName: "config-data") pod "09a8e315-dd60-47a9-b03c-0897b6f21b3d" (UID: "09a8e315-dd60-47a9-b03c-0897b6f21b3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.748716 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsg5n\" (UniqueName: \"kubernetes.io/projected/09a8e315-dd60-47a9-b03c-0897b6f21b3d-kube-api-access-qsg5n\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.748766 5036 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.748779 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:02 crc kubenswrapper[5036]: I0110 16:44:02.748789 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a8e315-dd60-47a9-b03c-0897b6f21b3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.299923 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lj72s" event={"ID":"09a8e315-dd60-47a9-b03c-0897b6f21b3d","Type":"ContainerDied","Data":"020c7634d681520529bea2eb7d907b5d0f5b6dc20ad78d6402d0a80072b7c3f6"} Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.300294 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="020c7634d681520529bea2eb7d907b5d0f5b6dc20ad78d6402d0a80072b7c3f6" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.299938 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lj72s" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.896161 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:03 crc kubenswrapper[5036]: E0110 16:44:03.896488 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" containerName="glance-db-sync" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.896500 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" containerName="glance-db-sync" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.896655 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" containerName="glance-db-sync" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.898306 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:03 crc kubenswrapper[5036]: I0110 16:44:03.923489 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.068959 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.068999 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxdj\" (UniqueName: \"kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.069244 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.069286 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.069323 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.171123 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.171739 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.171772 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxdj\" (UniqueName: \"kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.171863 5036 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.171894 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.172700 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.172718 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.172764 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.173146 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.196695 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxdj\" (UniqueName: \"kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj\") pod \"dnsmasq-dns-7987f74bbc-zlbkl\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:04 crc kubenswrapper[5036]: I0110 16:44:04.219115 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:08 crc kubenswrapper[5036]: I0110 16:44:08.670929 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Jan 10 16:44:13 crc kubenswrapper[5036]: I0110 16:44:13.671629 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Jan 10 16:44:17 crc kubenswrapper[5036]: E0110 16:44:17.209012 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 10 16:44:17 crc kubenswrapper[5036]: E0110 16:44:17.209783 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf9h67ch5d7hfh567h8ch55bh76h7chbh56bh677h5f9h58ch554h57fhdchfbh7ch97h689h65fhc4h5ch59fh5d5hcch5b5h545h5c7h5f6h66q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mfdn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(acd18657-f02e-4b2f-8ec6-e46b2408e720): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.294703 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.439912 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc\") pod \"ba71d9fa-3872-4975-8f46-767f96064411\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.440021 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb\") pod \"ba71d9fa-3872-4975-8f46-767f96064411\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.440072 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config\") pod \"ba71d9fa-3872-4975-8f46-767f96064411\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.440111 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb\") pod \"ba71d9fa-3872-4975-8f46-767f96064411\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.440170 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cs7f\" (UniqueName: \"kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f\") pod \"ba71d9fa-3872-4975-8f46-767f96064411\" (UID: \"ba71d9fa-3872-4975-8f46-767f96064411\") " Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.443892 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f" (OuterVolumeSpecName: "kube-api-access-8cs7f") pod "ba71d9fa-3872-4975-8f46-767f96064411" (UID: "ba71d9fa-3872-4975-8f46-767f96064411"). InnerVolumeSpecName "kube-api-access-8cs7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.448578 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" event={"ID":"ba71d9fa-3872-4975-8f46-767f96064411","Type":"ContainerDied","Data":"b49b34eb0f028847e2974dba9a1c8267fa741b7e10cc0b972bbf9337017079bd"} Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.448776 5036 scope.go:117] "RemoveContainer" containerID="cc34ed8a49035e5d47d4f6cddd767ffa78c14186e029a057944a0517ff7bd9d7" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.448978 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.480852 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba71d9fa-3872-4975-8f46-767f96064411" (UID: "ba71d9fa-3872-4975-8f46-767f96064411"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.484814 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba71d9fa-3872-4975-8f46-767f96064411" (UID: "ba71d9fa-3872-4975-8f46-767f96064411"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.485521 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config" (OuterVolumeSpecName: "config") pod "ba71d9fa-3872-4975-8f46-767f96064411" (UID: "ba71d9fa-3872-4975-8f46-767f96064411"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.495449 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba71d9fa-3872-4975-8f46-767f96064411" (UID: "ba71d9fa-3872-4975-8f46-767f96064411"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.543005 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.543385 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.543397 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.543409 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cs7f\" (UniqueName: \"kubernetes.io/projected/ba71d9fa-3872-4975-8f46-767f96064411-kube-api-access-8cs7f\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.543421 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba71d9fa-3872-4975-8f46-767f96064411-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.786794 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:44:17 crc kubenswrapper[5036]: I0110 16:44:17.798370 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7p28q"] Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.320497 5036 scope.go:117] "RemoveContainer" containerID="f47fc35c910e13afe46c3fd40743e39da0247e4dca1ad5fb4fce4ac8a4c9e339" Jan 10 16:44:18 crc kubenswrapper[5036]: E0110 16:44:18.334782 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 10 16:44:18 crc kubenswrapper[5036]: E0110 16:44:18.334983 5036 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5lt62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-j9crs_openstack(6fe6dd46-603d-4595-ad27-32f98623fbcc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 16:44:18 crc kubenswrapper[5036]: E0110 16:44:18.336372 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-j9crs" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" Jan 10 16:44:18 crc kubenswrapper[5036]: E0110 16:44:18.493990 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-j9crs" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.522776 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba71d9fa-3872-4975-8f46-767f96064411" 
path="/var/lib/kubelet/pods/ba71d9fa-3872-4975-8f46-767f96064411/volumes" Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.672787 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-7p28q" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.756748 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.858133 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cngfk"] Jan 10 16:44:18 crc kubenswrapper[5036]: W0110 16:44:18.975134 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e26dab_ee9d_414e_8447_738d9f153999.slice/crio-63f4a5f0d91cbdac0698dc3d3f123f7f1af3963b52a1a73143ec940dd2e8f48b WatchSource:0}: Error finding container 63f4a5f0d91cbdac0698dc3d3f123f7f1af3963b52a1a73143ec940dd2e8f48b: Status 404 returned error can't find the container with id 63f4a5f0d91cbdac0698dc3d3f123f7f1af3963b52a1a73143ec940dd2e8f48b Jan 10 16:44:18 crc kubenswrapper[5036]: W0110 16:44:18.975453 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cec5a6e_1a0d_45e9_a4f3_5e8aedc3d402.slice/crio-683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf WatchSource:0}: Error finding container 683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf: Status 404 returned error can't find the container with id 683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf Jan 10 16:44:18 crc kubenswrapper[5036]: I0110 16:44:18.980519 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.501389 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wwspw" event={"ID":"7b4df096-e6ee-47df-a4ae-035aeade27a6","Type":"ContainerStarted","Data":"1b6fe53b54157c6721584141cb346327060b09cff0ddbcf42cf33cb176320ec8"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.540229 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wwspw" podStartSLOduration=4.216843796 podStartE2EDuration="30.540205127s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="2026-01-10 16:43:50.880458942 +0000 UTC m=+952.750694436" lastFinishedPulling="2026-01-10 16:44:17.203820273 +0000 UTC m=+979.074055767" observedRunningTime="2026-01-10 16:44:19.534762342 +0000 UTC m=+981.404997856" watchObservedRunningTime="2026-01-10 16:44:19.540205127 +0000 UTC m=+981.410440621" Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.549113 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c9rbf" event={"ID":"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c","Type":"ContainerStarted","Data":"ee1ff43f87aafe6e6053d10753108168c66f458552bafa514e0c8a418b77ed78"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.559035 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cngfk" event={"ID":"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402","Type":"ContainerStarted","Data":"ad1572e307019edae219440a682d075861e0bb46a1d5f9b75a8dd28efff7b578"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 
16:44:19.559094 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cngfk" event={"ID":"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402","Type":"ContainerStarted","Data":"683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.563993 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-c9rbf" podStartSLOduration=3.096762055 podStartE2EDuration="30.563977542s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="2026-01-10 16:43:50.81122094 +0000 UTC m=+952.681456434" lastFinishedPulling="2026-01-10 16:44:18.278436427 +0000 UTC m=+980.148671921" observedRunningTime="2026-01-10 16:44:19.562409517 +0000 UTC m=+981.432645011" watchObservedRunningTime="2026-01-10 16:44:19.563977542 +0000 UTC m=+981.434213046" Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.566739 5036 generic.go:334] "Generic (PLEG): container finished" podID="f7e26dab-ee9d-414e-8447-738d9f153999" containerID="20d04bde7019142ba65661ac9bede99ff050f32518594f604036d4858f635e69" exitCode=0 Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.566804 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" event={"ID":"f7e26dab-ee9d-414e-8447-738d9f153999","Type":"ContainerDied","Data":"20d04bde7019142ba65661ac9bede99ff050f32518594f604036d4858f635e69"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.566836 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" event={"ID":"f7e26dab-ee9d-414e-8447-738d9f153999","Type":"ContainerStarted","Data":"63f4a5f0d91cbdac0698dc3d3f123f7f1af3963b52a1a73143ec940dd2e8f48b"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.570752 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerStarted","Data":"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc"} Jan 10 16:44:19 crc kubenswrapper[5036]: I0110 16:44:19.589018 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cngfk" podStartSLOduration=21.588997473 podStartE2EDuration="21.588997473s" podCreationTimestamp="2026-01-10 16:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:19.579124972 +0000 UTC m=+981.449360466" watchObservedRunningTime="2026-01-10 16:44:19.588997473 +0000 UTC m=+981.459232967" Jan 10 16:44:20 crc kubenswrapper[5036]: I0110 16:44:20.582130 5036 generic.go:334] "Generic (PLEG): container finished" podID="b1899d96-c3b2-415c-b1fd-7c2847da4370" containerID="5aa7d4af41e3913267cb29417492d0762ad6c746ab2547a867c7a0c593e6b8a1" exitCode=0 Jan 10 16:44:20 crc kubenswrapper[5036]: I0110 16:44:20.582221 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bks65" event={"ID":"b1899d96-c3b2-415c-b1fd-7c2847da4370","Type":"ContainerDied","Data":"5aa7d4af41e3913267cb29417492d0762ad6c746ab2547a867c7a0c593e6b8a1"} Jan 10 16:44:20 crc kubenswrapper[5036]: I0110 16:44:20.586045 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" event={"ID":"f7e26dab-ee9d-414e-8447-738d9f153999","Type":"ContainerStarted","Data":"eb5d89ff3b5a81f600b0f17c4b84f5c3af033a32a2b613964f1c3bc32fe29fdb"} Jan 10 16:44:20 crc kubenswrapper[5036]: I0110 
16:44:20.625548 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" podStartSLOduration=17.625529253 podStartE2EDuration="17.625529253s" podCreationTimestamp="2026-01-10 16:44:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:20.625302457 +0000 UTC m=+982.495537951" watchObservedRunningTime="2026-01-10 16:44:20.625529253 +0000 UTC m=+982.495764747" Jan 10 16:44:21 crc kubenswrapper[5036]: I0110 16:44:21.600005 5036 generic.go:334] "Generic (PLEG): container finished" podID="0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" containerID="ee1ff43f87aafe6e6053d10753108168c66f458552bafa514e0c8a418b77ed78" exitCode=0 Jan 10 16:44:21 crc kubenswrapper[5036]: I0110 16:44:21.600104 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c9rbf" event={"ID":"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c","Type":"ContainerDied","Data":"ee1ff43f87aafe6e6053d10753108168c66f458552bafa514e0c8a418b77ed78"} Jan 10 16:44:21 crc kubenswrapper[5036]: I0110 16:44:21.602801 5036 generic.go:334] "Generic (PLEG): container finished" podID="7b4df096-e6ee-47df-a4ae-035aeade27a6" containerID="1b6fe53b54157c6721584141cb346327060b09cff0ddbcf42cf33cb176320ec8" exitCode=0 Jan 10 16:44:21 crc kubenswrapper[5036]: I0110 16:44:21.602975 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wwspw" event={"ID":"7b4df096-e6ee-47df-a4ae-035aeade27a6","Type":"ContainerDied","Data":"1b6fe53b54157c6721584141cb346327060b09cff0ddbcf42cf33cb176320ec8"} Jan 10 16:44:21 crc kubenswrapper[5036]: I0110 16:44:21.603891 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:22 crc kubenswrapper[5036]: I0110 16:44:22.616110 5036 generic.go:334] "Generic (PLEG): container finished" podID="2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" containerID="ad1572e307019edae219440a682d075861e0bb46a1d5f9b75a8dd28efff7b578" exitCode=0 Jan 10 16:44:22 crc kubenswrapper[5036]: I0110 16:44:22.616223 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cngfk" event={"ID":"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402","Type":"ContainerDied","Data":"ad1572e307019edae219440a682d075861e0bb46a1d5f9b75a8dd28efff7b578"} Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.446660 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bks65" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.472399 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.479384 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wwspw" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553708 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data\") pod \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553751 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvndh\" (UniqueName: \"kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh\") pod \"b1899d96-c3b2-415c-b1fd-7c2847da4370\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553789 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk\") pod \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\" (UID: \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553851 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config\") pod \"b1899d96-c3b2-415c-b1fd-7c2847da4370\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553887 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts\") pod \"7b4df096-e6ee-47df-a4ae-035aeade27a6\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553934 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs\") pod \"7b4df096-e6ee-47df-a4ae-035aeade27a6\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553964 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw8gn\" (UniqueName: \"kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn\") pod \"7b4df096-e6ee-47df-a4ae-035aeade27a6\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.553986 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle\") pod \"7b4df096-e6ee-47df-a4ae-035aeade27a6\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.554037 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data\") pod \"7b4df096-e6ee-47df-a4ae-035aeade27a6\" (UID: \"7b4df096-e6ee-47df-a4ae-035aeade27a6\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.554089 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle\") pod \"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\" (UID: 
\"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.554134 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle\") pod \"b1899d96-c3b2-415c-b1fd-7c2847da4370\" (UID: \"b1899d96-c3b2-415c-b1fd-7c2847da4370\") " Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.555210 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs" (OuterVolumeSpecName: "logs") pod "7b4df096-e6ee-47df-a4ae-035aeade27a6" (UID: "7b4df096-e6ee-47df-a4ae-035aeade27a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.559169 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn" (OuterVolumeSpecName: "kube-api-access-qw8gn") pod "7b4df096-e6ee-47df-a4ae-035aeade27a6" (UID: "7b4df096-e6ee-47df-a4ae-035aeade27a6"). InnerVolumeSpecName "kube-api-access-qw8gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.559586 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts" (OuterVolumeSpecName: "scripts") pod "7b4df096-e6ee-47df-a4ae-035aeade27a6" (UID: "7b4df096-e6ee-47df-a4ae-035aeade27a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.559961 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh" (OuterVolumeSpecName: "kube-api-access-zvndh") pod "b1899d96-c3b2-415c-b1fd-7c2847da4370" (UID: "b1899d96-c3b2-415c-b1fd-7c2847da4370"). InnerVolumeSpecName "kube-api-access-zvndh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.579866 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" (UID: "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.581285 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk" (OuterVolumeSpecName: "kube-api-access-4jksk") pod "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" (UID: "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c"). InnerVolumeSpecName "kube-api-access-4jksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.590749 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b4df096-e6ee-47df-a4ae-035aeade27a6" (UID: "7b4df096-e6ee-47df-a4ae-035aeade27a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.593022 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" (UID: "0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.602021 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data" (OuterVolumeSpecName: "config-data") pod "7b4df096-e6ee-47df-a4ae-035aeade27a6" (UID: "7b4df096-e6ee-47df-a4ae-035aeade27a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.603641 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config" (OuterVolumeSpecName: "config") pod "b1899d96-c3b2-415c-b1fd-7c2847da4370" (UID: "b1899d96-c3b2-415c-b1fd-7c2847da4370"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.622410 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1899d96-c3b2-415c-b1fd-7c2847da4370" (UID: "b1899d96-c3b2-415c-b1fd-7c2847da4370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.624098 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bks65" event={"ID":"b1899d96-c3b2-415c-b1fd-7c2847da4370","Type":"ContainerDied","Data":"5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce"} Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.624134 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5267c48307419dbed415895a02f8b32125c473b1893b803c0bd37456075a80ce" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.624138 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bks65" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.626810 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerStarted","Data":"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd"} Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.628281 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wwspw" event={"ID":"7b4df096-e6ee-47df-a4ae-035aeade27a6","Type":"ContainerDied","Data":"328c9ee06832297cb6657c7218bd7cc85970b8be63ed6a69e57cf9d9abdae835"} Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.628341 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328c9ee06832297cb6657c7218bd7cc85970b8be63ed6a69e57cf9d9abdae835" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.628307 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wwspw" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.629777 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-c9rbf" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.629958 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-c9rbf" event={"ID":"0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c","Type":"ContainerDied","Data":"c0fd535b2a0707eb2f6c0883417a35643945311a7657e29f12851a1cb8c15df6"} Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.629980 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0fd535b2a0707eb2f6c0883417a35643945311a7657e29f12851a1cb8c15df6" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.657686 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4df096-e6ee-47df-a4ae-035aeade27a6-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658148 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw8gn\" (UniqueName: \"kubernetes.io/projected/7b4df096-e6ee-47df-a4ae-035aeade27a6-kube-api-access-qw8gn\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658161 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658170 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658202 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658213 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658222 5036 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658231 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvndh\" (UniqueName: \"kubernetes.io/projected/b1899d96-c3b2-415c-b1fd-7c2847da4370-kube-api-access-zvndh\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658240 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c-kube-api-access-4jksk\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658249 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1899d96-c3b2-415c-b1fd-7c2847da4370-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.658257 5036 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4df096-e6ee-47df-a4ae-035aeade27a6-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827542 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6ffbbc4bd-swcjc"] Jan 10 16:44:23 crc kubenswrapper[5036]: E0110 16:44:23.827876 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1899d96-c3b2-415c-b1fd-7c2847da4370" containerName="neutron-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827887 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1899d96-c3b2-415c-b1fd-7c2847da4370" containerName="neutron-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: E0110 16:44:23.827899 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827907 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" Jan 10 16:44:23 crc kubenswrapper[5036]: E0110 16:44:23.827921 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" containerName="barbican-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827927 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" containerName="barbican-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: E0110 16:44:23.827940 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b4df096-e6ee-47df-a4ae-035aeade27a6" containerName="placement-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827946 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4df096-e6ee-47df-a4ae-035aeade27a6" containerName="placement-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: E0110 16:44:23.827960 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="init" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.827966 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="init" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.828102 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba71d9fa-3872-4975-8f46-767f96064411" containerName="dnsmasq-dns" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.828119 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b4df096-e6ee-47df-a4ae-035aeade27a6" containerName="placement-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.828127 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1899d96-c3b2-415c-b1fd-7c2847da4370" containerName="neutron-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.828136 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" containerName="barbican-db-sync" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.828986 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.833197 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.833283 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.833205 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.833474 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zzv57" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.843949 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.877973 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ffbbc4bd-swcjc"] Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.944182 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-77cbb79454-h7btf"] Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.945598 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.950605 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.950923 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-q796t" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.952641 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967632 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxwj\" (UniqueName: \"kubernetes.io/projected/5b379ab6-fc59-475f-909f-4f71e7184803-kube-api-access-4bxwj\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967695 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-combined-ca-bundle\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967717 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-scripts\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967738 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-internal-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " 
pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967779 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b379ab6-fc59-475f-909f-4f71e7184803-logs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967818 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-public-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.967876 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-config-data\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.991746 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-55c7665d4c-brkx9"] Jan 10 16:44:23 crc kubenswrapper[5036]: I0110 16:44:23.993398 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.000313 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.014014 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55c7665d4c-brkx9"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.043463 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77cbb79454-h7btf"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.052425 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.069661 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731670b8-d6af-49c5-b8cf-ddeafb2462c7-logs\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.069891 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.069920 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-public-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.069946 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070057 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmrgz\" (UniqueName: \"kubernetes.io/projected/608bfa08-ff8b-4f06-bc62-e456f9e2005c-kube-api-access-wmrgz\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070142 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-config-data\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070172 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data-custom\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070198 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxwj\" (UniqueName: \"kubernetes.io/projected/5b379ab6-fc59-475f-909f-4f71e7184803-kube-api-access-4bxwj\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070232 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-combined-ca-bundle\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070310 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-combined-ca-bundle\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070344 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-scripts\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070368 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data-custom\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070390 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608bfa08-ff8b-4f06-bc62-e456f9e2005c-logs\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070419 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-internal-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070446 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmj8\" (UniqueName: \"kubernetes.io/projected/731670b8-d6af-49c5-b8cf-ddeafb2462c7-kube-api-access-vlmj8\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070478 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-combined-ca-bundle\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.070572 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b379ab6-fc59-475f-909f-4f71e7184803-logs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.071014 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b379ab6-fc59-475f-909f-4f71e7184803-logs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.078507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-combined-ca-bundle\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.083074 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-scripts\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.087239 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-public-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.088166 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-internal-tls-certs\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.096305 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b379ab6-fc59-475f-909f-4f71e7184803-config-data\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.111880 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxwj\" (UniqueName: \"kubernetes.io/projected/5b379ab6-fc59-475f-909f-4f71e7184803-kube-api-access-4bxwj\") pod \"placement-6ffbbc4bd-swcjc\" (UID: \"5b379ab6-fc59-475f-909f-4f71e7184803\") " pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.131030 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.131317 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="dnsmasq-dns" containerID="cri-o://eb5d89ff3b5a81f600b0f17c4b84f5c3af033a32a2b613964f1c3bc32fe29fdb" gracePeriod=10 Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.135054 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.142867 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:24 crc kubenswrapper[5036]: E0110 16:44:24.143182 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" 
containerName="keystone-bootstrap" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.143198 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" containerName="keystone-bootstrap" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.143353 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" containerName="keystone-bootstrap" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.147423 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.170761 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.172315 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.174720 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.174744 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.174857 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.174901 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.174984 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl5qc\" (UniqueName: \"kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175029 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175057 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys\") pod \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\" (UID: \"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402\") " Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175252 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731670b8-d6af-49c5-b8cf-ddeafb2462c7-logs\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: 
\"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175280 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175299 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175326 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmrgz\" (UniqueName: \"kubernetes.io/projected/608bfa08-ff8b-4f06-bc62-e456f9e2005c-kube-api-access-wmrgz\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175377 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data-custom\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175398 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-combined-ca-bundle\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175420 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data-custom\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175435 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608bfa08-ff8b-4f06-bc62-e456f9e2005c-logs\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175454 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmj8\" (UniqueName: \"kubernetes.io/projected/731670b8-d6af-49c5-b8cf-ddeafb2462c7-kube-api-access-vlmj8\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.175473 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-combined-ca-bundle\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.183995 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/731670b8-d6af-49c5-b8cf-ddeafb2462c7-logs\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.185670 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.203305 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc" (OuterVolumeSpecName: "kube-api-access-bl5qc") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "kube-api-access-bl5qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.203354 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts" (OuterVolumeSpecName: "scripts") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.203544 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/608bfa08-ff8b-4f06-bc62-e456f9e2005c-logs\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.203699 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.204035 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-combined-ca-bundle\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.206336 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.216409 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-config-data-custom\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.220595 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.224507 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.225473 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/608bfa08-ff8b-4f06-bc62-e456f9e2005c-combined-ca-bundle\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.241410 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.244666 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmrgz\" (UniqueName: \"kubernetes.io/projected/608bfa08-ff8b-4f06-bc62-e456f9e2005c-kube-api-access-wmrgz\") pod \"barbican-worker-55c7665d4c-brkx9\" (UID: \"608bfa08-ff8b-4f06-bc62-e456f9e2005c\") " pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.244753 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.245674 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/731670b8-d6af-49c5-b8cf-ddeafb2462c7-config-data-custom\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.258831 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.259484 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlmj8\" (UniqueName: \"kubernetes.io/projected/731670b8-d6af-49c5-b8cf-ddeafb2462c7-kube-api-access-vlmj8\") pod \"barbican-keystone-listener-77cbb79454-h7btf\" (UID: \"731670b8-d6af-49c5-b8cf-ddeafb2462c7\") " pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.271716 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.276410 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dmgt\" (UniqueName: \"kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.276453 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277125 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277157 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277206 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277225 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277245 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc 
kubenswrapper[5036]: I0110 16:44:24.277310 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277336 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277364 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5c9\" (UniqueName: \"kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277421 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl5qc\" (UniqueName: \"kubernetes.io/projected/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-kube-api-access-bl5qc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277432 5036 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277446 5036 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277457 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277466 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.277549 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.333816 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-55c7665d4c-brkx9" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378444 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378494 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378532 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5c9\" (UniqueName: \"kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378577 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dmgt\" (UniqueName: \"kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378611 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378709 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378735 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378801 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378824 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " 
pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.378847 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.380665 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.380665 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.381072 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.381344 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.381670 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.385105 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.387898 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.389232 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.417138 5036 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data" (OuterVolumeSpecName: "config-data") pod "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" (UID: "2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.437319 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5c9\" (UniqueName: \"kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9\") pod \"barbican-api-69f754595d-jrtgk\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.440408 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dmgt\" (UniqueName: \"kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt\") pod \"dnsmasq-dns-699df9757c-gcfrt\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.480318 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.505086 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.558274 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.687848 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.713249 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cngfk" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.713994 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cngfk" event={"ID":"2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402","Type":"ContainerDied","Data":"683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf"} Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.714023 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="683c830bf043b821349a486826fdb12d8d397416419b843579341ddaedf209cf" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.754967 5036 generic.go:334] "Generic (PLEG): container finished" podID="f7e26dab-ee9d-414e-8447-738d9f153999" containerID="eb5d89ff3b5a81f600b0f17c4b84f5c3af033a32a2b613964f1c3bc32fe29fdb" exitCode=0 Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.755018 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" event={"ID":"f7e26dab-ee9d-414e-8447-738d9f153999","Type":"ContainerDied","Data":"eb5d89ff3b5a81f600b0f17c4b84f5c3af033a32a2b613964f1c3bc32fe29fdb"} Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.761571 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.762892 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.812765 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.833079 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f7c9c789b-dj95d"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.834815 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.839443 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.839619 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.839903 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.840473 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.840586 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.845607 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-n6pzn" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.869631 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f7c9c789b-dj95d"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892647 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-internal-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892719 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-config-data\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892751 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcp4\" (UniqueName: \"kubernetes.io/projected/75115cba-8c6e-4c48-b71c-0277c43f446c-kube-api-access-6mcp4\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892788 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892839 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-scripts\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892884 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-combined-ca-bundle\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892915 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-credential-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892935 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892967 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.892997 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-fernet-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.893024 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hl6p\" (UniqueName: \"kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.893082 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-public-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.893182 5036 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.894574 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.897053 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-bj2fv" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.897241 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.897428 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.897585 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.915259 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994425 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-combined-ca-bundle\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994475 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-credential-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994499 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994551 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994585 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-fernet-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994608 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hl6p\" (UniqueName: \"kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994641 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994668 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994716 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-public-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994742 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994766 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-internal-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994786 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-config-data\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994805 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chsqp\" (UniqueName: \"kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994828 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcp4\" (UniqueName: \"kubernetes.io/projected/75115cba-8c6e-4c48-b71c-0277c43f446c-kube-api-access-6mcp4\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994846 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994881 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994905 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.994929 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-scripts\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:24 crc kubenswrapper[5036]: I0110 16:44:24.999865 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.002194 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.007902 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.008071 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-credential-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.008144 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.008459 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.008851 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-public-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.009166 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-combined-ca-bundle\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.013281 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-internal-tls-certs\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.017043 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-fernet-keys\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.019926 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-config-data\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.022667 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcp4\" (UniqueName: \"kubernetes.io/projected/75115cba-8c6e-4c48-b71c-0277c43f446c-kube-api-access-6mcp4\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.031240 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75115cba-8c6e-4c48-b71c-0277c43f446c-scripts\") pod \"keystone-6f7c9c789b-dj95d\" (UID: \"75115cba-8c6e-4c48-b71c-0277c43f446c\") " pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.034846 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hl6p\" (UniqueName: \"kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p\") pod \"dnsmasq-dns-6bb684768f-hzq7n\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.095715 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb\") pod \"f7e26dab-ee9d-414e-8447-738d9f153999\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.095808 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb\") pod \"f7e26dab-ee9d-414e-8447-738d9f153999\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.095833 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc\") pod \"f7e26dab-ee9d-414e-8447-738d9f153999\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.095903 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvxdj\" (UniqueName: \"kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj\") pod \"f7e26dab-ee9d-414e-8447-738d9f153999\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.095922 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config\") pod \"f7e26dab-ee9d-414e-8447-738d9f153999\" (UID: \"f7e26dab-ee9d-414e-8447-738d9f153999\") " Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.096196 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chsqp\" (UniqueName: \"kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.096222 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.096391 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.096428 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.096466 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " 
pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.100939 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.107820 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.110951 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.126962 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.129924 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.140073 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chsqp\" (UniqueName: \"kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp\") pod \"neutron-547c4cc84d-fr8g2\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.145070 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj" (OuterVolumeSpecName: "kube-api-access-vvxdj") pod "f7e26dab-ee9d-414e-8447-738d9f153999" (UID: "f7e26dab-ee9d-414e-8447-738d9f153999"). InnerVolumeSpecName "kube-api-access-vvxdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:25 crc kubenswrapper[5036]: W0110 16:44:25.162589 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod731670b8_d6af_49c5_b8cf_ddeafb2462c7.slice/crio-1171fd4a69b26c35e62bcfbc05f169dbce882ad076cc25cb651265d775c4bc1f WatchSource:0}: Error finding container 1171fd4a69b26c35e62bcfbc05f169dbce882ad076cc25cb651265d775c4bc1f: Status 404 returned error can't find the container with id 1171fd4a69b26c35e62bcfbc05f169dbce882ad076cc25cb651265d775c4bc1f Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.162874 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-77cbb79454-h7btf"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.194134 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7e26dab-ee9d-414e-8447-738d9f153999" (UID: "f7e26dab-ee9d-414e-8447-738d9f153999"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.195199 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7e26dab-ee9d-414e-8447-738d9f153999" (UID: "f7e26dab-ee9d-414e-8447-738d9f153999"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.199916 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.200278 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.200296 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvxdj\" (UniqueName: \"kubernetes.io/projected/f7e26dab-ee9d-414e-8447-738d9f153999-kube-api-access-vvxdj\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.217505 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f7e26dab-ee9d-414e-8447-738d9f153999" (UID: "f7e26dab-ee9d-414e-8447-738d9f153999"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.225836 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config" (OuterVolumeSpecName: "config") pod "f7e26dab-ee9d-414e-8447-738d9f153999" (UID: "f7e26dab-ee9d-414e-8447-738d9f153999"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.233391 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.261180 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.303997 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.304054 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f7e26dab-ee9d-414e-8447-738d9f153999-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.403814 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ffbbc4bd-swcjc"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.409939 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55c7665d4c-brkx9"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.570430 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.575806 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.773774 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerStarted","Data":"22140576d0578e7b1e5bab75cbdfac54ec86bf9868d1df5dec99a0cd03ad99f1"} Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.776127 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" event={"ID":"731670b8-d6af-49c5-b8cf-ddeafb2462c7","Type":"ContainerStarted","Data":"1171fd4a69b26c35e62bcfbc05f169dbce882ad076cc25cb651265d775c4bc1f"} Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.778101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55c7665d4c-brkx9" event={"ID":"608bfa08-ff8b-4f06-bc62-e456f9e2005c","Type":"ContainerStarted","Data":"53d8a19df84fa9db1db2e6cf45a2fe6c9ba5453d5cd226a2a1a0746436c715d2"} Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.782062 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" event={"ID":"00621c6a-f181-4e18-a5a3-57d28c5153a2","Type":"ContainerStarted","Data":"1c3b127eef6a377a3b0ef33253a0c309f8cfab7cf2b181b58dfe4c00f5325489"} Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.784806 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" event={"ID":"f7e26dab-ee9d-414e-8447-738d9f153999","Type":"ContainerDied","Data":"63f4a5f0d91cbdac0698dc3d3f123f7f1af3963b52a1a73143ec940dd2e8f48b"} Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.784849 5036 scope.go:117] "RemoveContainer" containerID="eb5d89ff3b5a81f600b0f17c4b84f5c3af033a32a2b613964f1c3bc32fe29fdb" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.784884 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zlbkl" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.797717 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.800468 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffbbc4bd-swcjc" event={"ID":"5b379ab6-fc59-475f-909f-4f71e7184803","Type":"ContainerStarted","Data":"46ec057467cfafd216912218e495a8cf502eb846e39ec1e1ea25cd87616c8871"} Jan 10 16:44:25 crc kubenswrapper[5036]: W0110 16:44:25.871326 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0567814_352b_4f05_8175_a103c0f98d0b.slice/crio-b31887c4a762197913f6ddce21d7ecb35ed67b9af1144ec42be6b75a2002d8a5 WatchSource:0}: Error finding container b31887c4a762197913f6ddce21d7ecb35ed67b9af1144ec42be6b75a2002d8a5: Status 404 returned error can't find the container with id b31887c4a762197913f6ddce21d7ecb35ed67b9af1144ec42be6b75a2002d8a5 Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.905422 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.905834 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.905434 5036 scope.go:117] "RemoveContainer" containerID="20d04bde7019142ba65661ac9bede99ff050f32518594f604036d4858f635e69" Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.912201 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.926957 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zlbkl"] Jan 10 16:44:25 crc kubenswrapper[5036]: I0110 16:44:25.938534 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f7c9c789b-dj95d"] Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.099582 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.533977 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" path="/var/lib/kubelet/pods/f7e26dab-ee9d-414e-8447-738d9f153999/volumes" Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.847332 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffbbc4bd-swcjc" event={"ID":"5b379ab6-fc59-475f-909f-4f71e7184803","Type":"ContainerStarted","Data":"541bba58137ae185118c84be881f05be5014aa9c40eb2526277fabb142a63a71"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.849045 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerStarted","Data":"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd"} Jan 10 
16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.849074 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerStarted","Data":"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.850463 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.850488 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.858659 5036 generic.go:334] "Generic (PLEG): container finished" podID="d0567814-352b-4f05-8175-a103c0f98d0b" containerID="f25d9f065af64361444059eed7131df34be5677f2136d35ea3559d87cc758371" exitCode=0 Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.859065 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" event={"ID":"d0567814-352b-4f05-8175-a103c0f98d0b","Type":"ContainerDied","Data":"f25d9f065af64361444059eed7131df34be5677f2136d35ea3559d87cc758371"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.859116 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" event={"ID":"d0567814-352b-4f05-8175-a103c0f98d0b","Type":"ContainerStarted","Data":"b31887c4a762197913f6ddce21d7ecb35ed67b9af1144ec42be6b75a2002d8a5"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.863267 5036 generic.go:334] "Generic (PLEG): container finished" podID="00621c6a-f181-4e18-a5a3-57d28c5153a2" containerID="e5529d227e26959453f7e6616bce0f222b7282e043e82ab3354c3c2742a67bf6" exitCode=0 Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.863323 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" event={"ID":"00621c6a-f181-4e18-a5a3-57d28c5153a2","Type":"ContainerDied","Data":"e5529d227e26959453f7e6616bce0f222b7282e043e82ab3354c3c2742a67bf6"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.882848 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerStarted","Data":"5bc70539aa69aa7023f653e028042c3ee40d087037fd2fe6d281659ed7a45afa"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.885381 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69f754595d-jrtgk" podStartSLOduration=2.8853609479999998 podStartE2EDuration="2.885360948s" podCreationTimestamp="2026-01-10 16:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:26.883477815 +0000 UTC m=+988.753713319" watchObservedRunningTime="2026-01-10 16:44:26.885360948 +0000 UTC m=+988.755596442" Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.902614 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f7c9c789b-dj95d" event={"ID":"75115cba-8c6e-4c48-b71c-0277c43f446c","Type":"ContainerStarted","Data":"9a87ee16fb027c51e5d7730f12084ddcb832b98dacb9af560838b76506f473dc"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.902657 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f7c9c789b-dj95d" 
event={"ID":"75115cba-8c6e-4c48-b71c-0277c43f446c","Type":"ContainerStarted","Data":"7d82570f8f163897031847a2973ce6afce60d0ae57ea124e477c1bf0ee477875"} Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.902927 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:26 crc kubenswrapper[5036]: I0110 16:44:26.967998 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f7c9c789b-dj95d" podStartSLOduration=2.967982035 podStartE2EDuration="2.967982035s" podCreationTimestamp="2026-01-10 16:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:26.966873154 +0000 UTC m=+988.837108658" watchObservedRunningTime="2026-01-10 16:44:26.967982035 +0000 UTC m=+988.838217529" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.462135 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.467610 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74d5fd97c9-96pjx"] Jan 10 16:44:28 crc kubenswrapper[5036]: E0110 16:44:28.489075 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="init" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.489119 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="init" Jan 10 16:44:28 crc kubenswrapper[5036]: E0110 16:44:28.489153 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="dnsmasq-dns" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.489162 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="dnsmasq-dns" Jan 10 16:44:28 crc kubenswrapper[5036]: E0110 16:44:28.489182 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00621c6a-f181-4e18-a5a3-57d28c5153a2" containerName="init" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.489190 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="00621c6a-f181-4e18-a5a3-57d28c5153a2" containerName="init" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.489443 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e26dab-ee9d-414e-8447-738d9f153999" containerName="dnsmasq-dns" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.489456 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="00621c6a-f181-4e18-a5a3-57d28c5153a2" containerName="init" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.490348 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.492810 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.494183 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.503822 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74d5fd97c9-96pjx"] Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.595116 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config\") pod \"00621c6a-f181-4e18-a5a3-57d28c5153a2\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.595815 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb\") pod \"00621c6a-f181-4e18-a5a3-57d28c5153a2\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.596026 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dmgt\" (UniqueName: \"kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt\") pod \"00621c6a-f181-4e18-a5a3-57d28c5153a2\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.596277 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc\") pod \"00621c6a-f181-4e18-a5a3-57d28c5153a2\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.596425 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb\") pod \"00621c6a-f181-4e18-a5a3-57d28c5153a2\" (UID: \"00621c6a-f181-4e18-a5a3-57d28c5153a2\") " Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.596975 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jhr\" (UniqueName: \"kubernetes.io/projected/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-kube-api-access-p6jhr\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.597553 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-ovndb-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.597859 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-combined-ca-bundle\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 
crc kubenswrapper[5036]: I0110 16:44:28.598083 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.598280 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-httpd-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.598383 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-public-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.598596 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-internal-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.608569 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt" (OuterVolumeSpecName: "kube-api-access-8dmgt") pod "00621c6a-f181-4e18-a5a3-57d28c5153a2" (UID: "00621c6a-f181-4e18-a5a3-57d28c5153a2"). InnerVolumeSpecName "kube-api-access-8dmgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.619253 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00621c6a-f181-4e18-a5a3-57d28c5153a2" (UID: "00621c6a-f181-4e18-a5a3-57d28c5153a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.624470 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00621c6a-f181-4e18-a5a3-57d28c5153a2" (UID: "00621c6a-f181-4e18-a5a3-57d28c5153a2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.639181 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config" (OuterVolumeSpecName: "config") pod "00621c6a-f181-4e18-a5a3-57d28c5153a2" (UID: "00621c6a-f181-4e18-a5a3-57d28c5153a2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.647098 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00621c6a-f181-4e18-a5a3-57d28c5153a2" (UID: "00621c6a-f181-4e18-a5a3-57d28c5153a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700769 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-ovndb-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700844 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-combined-ca-bundle\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700907 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700923 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-public-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700941 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-httpd-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.700962 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-internal-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701050 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jhr\" (UniqueName: \"kubernetes.io/projected/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-kube-api-access-p6jhr\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701113 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701123 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701133 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dmgt\" (UniqueName: \"kubernetes.io/projected/00621c6a-f181-4e18-a5a3-57d28c5153a2-kube-api-access-8dmgt\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701143 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.701151 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00621c6a-f181-4e18-a5a3-57d28c5153a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.704944 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-ovndb-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.706364 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-internal-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.715291 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-public-tls-certs\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.719215 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-combined-ca-bundle\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.719372 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.719979 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-httpd-config\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.723166 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jhr\" (UniqueName: \"kubernetes.io/projected/ffb9a3a8-bbeb-414f-8d26-f35e51a05957-kube-api-access-p6jhr\") pod \"neutron-74d5fd97c9-96pjx\" (UID: \"ffb9a3a8-bbeb-414f-8d26-f35e51a05957\") " pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc 
kubenswrapper[5036]: I0110 16:44:28.813042 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.919033 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" event={"ID":"00621c6a-f181-4e18-a5a3-57d28c5153a2","Type":"ContainerDied","Data":"1c3b127eef6a377a3b0ef33253a0c309f8cfab7cf2b181b58dfe4c00f5325489"} Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.919113 5036 scope.go:117] "RemoveContainer" containerID="e5529d227e26959453f7e6616bce0f222b7282e043e82ab3354c3c2742a67bf6" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.919049 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-gcfrt" Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.983265 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:28 crc kubenswrapper[5036]: I0110 16:44:28.993423 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-gcfrt"] Jan 10 16:44:30 crc kubenswrapper[5036]: I0110 16:44:30.525636 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00621c6a-f181-4e18-a5a3-57d28c5153a2" path="/var/lib/kubelet/pods/00621c6a-f181-4e18-a5a3-57d28c5153a2/volumes" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.051649 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7655587964-dzfxf"] Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.055675 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.057656 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.060030 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.077172 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7655587964-dzfxf"] Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.164758 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.164808 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-combined-ca-bundle\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.164833 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-public-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 
16:44:31.164880 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-internal-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.164899 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5dqj\" (UniqueName: \"kubernetes.io/projected/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-kube-api-access-g5dqj\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.165348 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-logs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.165409 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data-custom\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267208 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-internal-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267259 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5dqj\" (UniqueName: \"kubernetes.io/projected/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-kube-api-access-g5dqj\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267302 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-logs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267348 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data-custom\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267378 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 
16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267407 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-combined-ca-bundle\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.267425 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-public-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.268223 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-logs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.273974 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-combined-ca-bundle\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.284026 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-internal-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.290220 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-public-tls-certs\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.293099 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.293312 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-config-data-custom\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.308651 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5dqj\" (UniqueName: \"kubernetes.io/projected/a96677c4-c2f0-4fba-bcb0-a657dfdd1f41-kube-api-access-g5dqj\") pod \"barbican-api-7655587964-dzfxf\" (UID: \"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41\") " pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.377003 5036 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.500607 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74d5fd97c9-96pjx"] Jan 10 16:44:31 crc kubenswrapper[5036]: I0110 16:44:31.858389 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7655587964-dzfxf"] Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.025538 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerStarted","Data":"9c50d3496b2756f1c362e47205a150378df5f2ba11fe1c0cd887ab6068044258"} Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.027319 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffbbc4bd-swcjc" event={"ID":"5b379ab6-fc59-475f-909f-4f71e7184803","Type":"ContainerStarted","Data":"403cbd9fa9944f9775930c251fd31c666c88124e50e4a42946cb400d74e2d4a4"} Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.027726 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.031848 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" event={"ID":"d0567814-352b-4f05-8175-a103c0f98d0b","Type":"ContainerStarted","Data":"efcc472b386dfdcb34147901510f490a00d52d9f868ac42d620fe9dcdc7f83ae"} Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.032316 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.033934 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74d5fd97c9-96pjx" event={"ID":"ffb9a3a8-bbeb-414f-8d26-f35e51a05957","Type":"ContainerStarted","Data":"e3531107278130a8faa2f2c53984f9beb06bb47b46bd17ecfc4425f2fa3c350e"} Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.033964 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74d5fd97c9-96pjx" event={"ID":"ffb9a3a8-bbeb-414f-8d26-f35e51a05957","Type":"ContainerStarted","Data":"03fbf1f3b44cc5128964b3dc70f538e2d735066d55217954f416267fd88f7577"} Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.356583 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6ffbbc4bd-swcjc" podStartSLOduration=9.356560282 podStartE2EDuration="9.356560282s" podCreationTimestamp="2026-01-10 16:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:32.344343975 +0000 UTC m=+994.214579469" watchObservedRunningTime="2026-01-10 16:44:32.356560282 +0000 UTC m=+994.226795776" Jan 10 16:44:32 crc kubenswrapper[5036]: I0110 16:44:32.378828 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" podStartSLOduration=8.378809364 podStartE2EDuration="8.378809364s" podCreationTimestamp="2026-01-10 16:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:32.374719358 +0000 UTC m=+994.244954862" watchObservedRunningTime="2026-01-10 16:44:32.378809364 +0000 UTC m=+994.249044858" Jan 10 16:44:33 crc kubenswrapper[5036]: I0110 16:44:33.054308 5036 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7655587964-dzfxf" event={"ID":"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41","Type":"ContainerStarted","Data":"8c2c50eef6735efe1305e921eab4566b30e2e8b8d51b2cff198795ded0346a71"} Jan 10 16:44:33 crc kubenswrapper[5036]: I0110 16:44:33.054664 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:33 crc kubenswrapper[5036]: I0110 16:44:33.865267 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:34 crc kubenswrapper[5036]: I0110 16:44:34.065565 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerStarted","Data":"d7c2f571918cef4b3224e7237afd1f22cdfacf8af387df5e4f12b31241286bc2"} Jan 10 16:44:34 crc kubenswrapper[5036]: I0110 16:44:34.065625 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:34 crc kubenswrapper[5036]: I0110 16:44:34.092265 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-547c4cc84d-fr8g2" podStartSLOduration=10.092244733 podStartE2EDuration="10.092244733s" podCreationTimestamp="2026-01-10 16:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:34.09211732 +0000 UTC m=+995.962352814" watchObservedRunningTime="2026-01-10 16:44:34.092244733 +0000 UTC m=+995.962480227" Jan 10 16:44:36 crc kubenswrapper[5036]: I0110 16:44:36.122550 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:36 crc kubenswrapper[5036]: I0110 16:44:36.190650 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:37 crc kubenswrapper[5036]: E0110 16:44:37.992622 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.103267 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7655587964-dzfxf" event={"ID":"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41","Type":"ContainerStarted","Data":"91cf8d18f27517315f5010e8c181594f4fbbd13cb5cf7fab497b06d54b3ee9a3"} Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.103321 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7655587964-dzfxf" event={"ID":"a96677c4-c2f0-4fba-bcb0-a657dfdd1f41","Type":"ContainerStarted","Data":"f19d473cbe7381197077bb156a784309cf566f07c0f239ab9a25d08bb8a08868"} Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.104252 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.104283 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.108608 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" 
event={"ID":"731670b8-d6af-49c5-b8cf-ddeafb2462c7","Type":"ContainerStarted","Data":"1cacdb89897a5c792836a403f90fe0f34635150199843926157c21d44a8256ae"} Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.111199 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74d5fd97c9-96pjx" event={"ID":"ffb9a3a8-bbeb-414f-8d26-f35e51a05957","Type":"ContainerStarted","Data":"662a9ce15e781f05bf22469fa7531264acf1138152a28cbf5a8b64c8ba3a073e"} Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.112118 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.134158 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7655587964-dzfxf" podStartSLOduration=7.134136138 podStartE2EDuration="7.134136138s" podCreationTimestamp="2026-01-10 16:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:38.122624551 +0000 UTC m=+999.992860045" watchObservedRunningTime="2026-01-10 16:44:38.134136138 +0000 UTC m=+1000.004371632" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.151165 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74d5fd97c9-96pjx" podStartSLOduration=10.151139021 podStartE2EDuration="10.151139021s" podCreationTimestamp="2026-01-10 16:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:38.145366087 +0000 UTC m=+1000.015601571" watchObservedRunningTime="2026-01-10 16:44:38.151139021 +0000 UTC m=+1000.021374515" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.159270 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerStarted","Data":"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35"} Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.159430 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="ceilometer-notification-agent" containerID="cri-o://70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc" gracePeriod=30 Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.159474 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.159516 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="proxy-httpd" containerID="cri-o://b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35" gracePeriod=30 Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.159551 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="sg-core" containerID="cri-o://df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd" gracePeriod=30 Jan 10 16:44:38 crc kubenswrapper[5036]: I0110 16:44:38.167949 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55c7665d4c-brkx9" 
event={"ID":"608bfa08-ff8b-4f06-bc62-e456f9e2005c","Type":"ContainerStarted","Data":"a40f10fdacf51b71c1e19655271de4dc5ebac3cca3fcd82587f7280e562b2cca"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.197797 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" event={"ID":"731670b8-d6af-49c5-b8cf-ddeafb2462c7","Type":"ContainerStarted","Data":"d84214e72a076bc4e90f629ab49f916316b633c50efc5811e33b86d3dedc70b7"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.201669 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9crs" event={"ID":"6fe6dd46-603d-4595-ad27-32f98623fbcc","Type":"ContainerStarted","Data":"cbfd17d699134a608165d5a7c768f8f1ef9c076a375f09a1733cc2038619dbbb"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.206027 5036 generic.go:334] "Generic (PLEG): container finished" podID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerID="b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35" exitCode=0 Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.206063 5036 generic.go:334] "Generic (PLEG): container finished" podID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerID="df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd" exitCode=2 Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.206114 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerDied","Data":"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.206143 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerDied","Data":"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.208317 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55c7665d4c-brkx9" event={"ID":"608bfa08-ff8b-4f06-bc62-e456f9e2005c","Type":"ContainerStarted","Data":"d2ad6ff67c4761fac69dcf51e00df3c39e526814bc506cf952327486e71091ba"} Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.240206 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-77cbb79454-h7btf" podStartSLOduration=3.884980986 podStartE2EDuration="16.240165514s" podCreationTimestamp="2026-01-10 16:44:23 +0000 UTC" firstStartedPulling="2026-01-10 16:44:25.165298471 +0000 UTC m=+987.035533965" lastFinishedPulling="2026-01-10 16:44:37.520482999 +0000 UTC m=+999.390718493" observedRunningTime="2026-01-10 16:44:39.22240129 +0000 UTC m=+1001.092636804" watchObservedRunningTime="2026-01-10 16:44:39.240165514 +0000 UTC m=+1001.110401008" Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.273254 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-j9crs" podStartSLOduration=3.4558718649999998 podStartE2EDuration="50.273232944s" podCreationTimestamp="2026-01-10 16:43:49 +0000 UTC" firstStartedPulling="2026-01-10 16:43:50.814566645 +0000 UTC m=+952.684802139" lastFinishedPulling="2026-01-10 16:44:37.631927704 +0000 UTC m=+999.502163218" observedRunningTime="2026-01-10 16:44:39.243581301 +0000 UTC m=+1001.113816825" watchObservedRunningTime="2026-01-10 16:44:39.273232944 +0000 UTC m=+1001.143468438" Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.287008 5036 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-55c7665d4c-brkx9" podStartSLOduration=4.303391221 podStartE2EDuration="16.286988414s" podCreationTimestamp="2026-01-10 16:44:23 +0000 UTC" firstStartedPulling="2026-01-10 16:44:25.459786546 +0000 UTC m=+987.330022040" lastFinishedPulling="2026-01-10 16:44:37.443383739 +0000 UTC m=+999.313619233" observedRunningTime="2026-01-10 16:44:39.263898568 +0000 UTC m=+1001.134134062" watchObservedRunningTime="2026-01-10 16:44:39.286988414 +0000 UTC m=+1001.157223908" Jan 10 16:44:39 crc kubenswrapper[5036]: I0110 16:44:39.955084 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.024914 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.024981 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdn5\" (UniqueName: \"kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.025054 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.025116 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.025200 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.025870 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.026018 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.026095 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle\") pod \"acd18657-f02e-4b2f-8ec6-e46b2408e720\" (UID: \"acd18657-f02e-4b2f-8ec6-e46b2408e720\") " Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.026280 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.026764 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.026781 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acd18657-f02e-4b2f-8ec6-e46b2408e720-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.031434 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts" (OuterVolumeSpecName: "scripts") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.042326 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5" (OuterVolumeSpecName: "kube-api-access-mfdn5") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "kube-api-access-mfdn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.054773 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.097972 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.111368 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data" (OuterVolumeSpecName: "config-data") pod "acd18657-f02e-4b2f-8ec6-e46b2408e720" (UID: "acd18657-f02e-4b2f-8ec6-e46b2408e720"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.114008 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.128747 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.128785 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdn5\" (UniqueName: \"kubernetes.io/projected/acd18657-f02e-4b2f-8ec6-e46b2408e720-kube-api-access-mfdn5\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.128803 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.128816 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.128830 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd18657-f02e-4b2f-8ec6-e46b2408e720-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.198548 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.198916 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="dnsmasq-dns" containerID="cri-o://4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4" gracePeriod=10 Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.257344 5036 generic.go:334] "Generic (PLEG): container finished" podID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerID="70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc" exitCode=0 Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.258289 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.262101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerDied","Data":"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc"} Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.262147 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acd18657-f02e-4b2f-8ec6-e46b2408e720","Type":"ContainerDied","Data":"04924f97357adea774932af9153dad042bc0009a7d2cfe2cd95279c95090affa"} Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.262166 5036 scope.go:117] "RemoveContainer" containerID="b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.362465 5036 scope.go:117] "RemoveContainer" containerID="df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.450904 5036 scope.go:117] "RemoveContainer" containerID="70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.462910 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.485157 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.509786 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.510254 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="proxy-httpd" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510267 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="proxy-httpd" Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.510283 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="sg-core" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510292 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="sg-core" Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.510308 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="ceilometer-notification-agent" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510314 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="ceilometer-notification-agent" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510477 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="sg-core" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510493 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="proxy-httpd" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.510504 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" containerName="ceilometer-notification-agent" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.512548 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.525904 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.527122 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.536997 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc445\" (UniqueName: \"kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537077 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537143 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537186 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537237 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537265 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.537304 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.555705 5036 scope.go:117] "RemoveContainer" containerID="b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.558635 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd18657-f02e-4b2f-8ec6-e46b2408e720" path="/var/lib/kubelet/pods/acd18657-f02e-4b2f-8ec6-e46b2408e720/volumes" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.560227 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.560632 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35\": container with ID starting with b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35 not found: ID does not exist" containerID="b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.560722 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35"} err="failed to get container status \"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35\": rpc error: code = NotFound desc = could not find container \"b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35\": container with ID starting with b8a92dd2d8aa3182e7e1ad6e40759affdf7b105a449c5e945540c99880621d35 not found: ID does not exist" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.560772 5036 scope.go:117] "RemoveContainer" containerID="df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd" Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.561221 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd\": container with ID starting with df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd not found: ID does not exist" containerID="df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.561253 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd"} err="failed to get container status \"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd\": rpc error: code = NotFound desc = could not find container \"df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd\": container with ID starting with df47d644831e788ecfaf9ef87a2766c5b00ea6efd48a54d1874c6dfa8fa44abd not found: ID does not exist" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.561275 5036 scope.go:117] "RemoveContainer" containerID="70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc" Jan 10 16:44:40 crc kubenswrapper[5036]: E0110 16:44:40.561619 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc\": container with ID starting with 70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc not found: ID does not exist" containerID="70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.561641 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc"} err="failed to get container status \"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc\": rpc error: code = NotFound desc = could not find container \"70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc\": container with ID starting with 70616488c1da8e0d8e5fd028915cba1cbe92da1ad3858c776c7fa631fb9855dc not found: ID does 
not exist" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638470 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc445\" (UniqueName: \"kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638547 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638611 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638651 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638706 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638734 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.638767 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.640150 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.642992 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.644338 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 
16:44:40.645187 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.654533 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.655032 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.662553 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc445\" (UniqueName: \"kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445\") pod \"ceilometer-0\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " pod="openstack/ceilometer-0" Jan 10 16:44:40 crc kubenswrapper[5036]: I0110 16:44:40.878089 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.188635 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.271997 5036 generic.go:334] "Generic (PLEG): container finished" podID="7789491d-15dc-44e0-88d9-141ba4009010" containerID="4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4" exitCode=0 Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.272073 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" event={"ID":"7789491d-15dc-44e0-88d9-141ba4009010","Type":"ContainerDied","Data":"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4"} Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.272090 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.272117 5036 scope.go:117] "RemoveContainer" containerID="4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.272104 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-vrp42" event={"ID":"7789491d-15dc-44e0-88d9-141ba4009010","Type":"ContainerDied","Data":"43c23df4c28a7f82a7a7e9ceea55f0dc6084cc1ac9e90172ab95e5de6224f1dd"} Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.316413 5036 scope.go:117] "RemoveContainer" containerID="7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.339719 5036 scope.go:117] "RemoveContainer" containerID="4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4" Jan 10 16:44:41 crc kubenswrapper[5036]: E0110 16:44:41.340230 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4\": container with ID starting with 4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4 not found: ID does not exist" containerID="4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.340273 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4"} err="failed to get container status \"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4\": rpc error: code = NotFound desc = could not find container \"4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4\": container with ID starting with 4c9a4f26c5c3c2fff3f468f9e247fe597705dd13b66d99a167ea534c903e25c4 not found: ID does not exist" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.340299 5036 scope.go:117] "RemoveContainer" containerID="7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8" Jan 10 16:44:41 crc kubenswrapper[5036]: E0110 16:44:41.340710 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8\": container with ID starting with 7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8 not found: ID does not exist" containerID="7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.340743 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8"} err="failed to get container status \"7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8\": rpc error: code = NotFound desc = could not find container \"7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8\": container with ID starting with 7419f13515967cfbca2537638e353578285e56e998029aa16bc7dd489d03aee8 not found: ID does not exist" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.351456 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc\") pod \"7789491d-15dc-44e0-88d9-141ba4009010\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " 
Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.351514 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb\") pod \"7789491d-15dc-44e0-88d9-141ba4009010\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.351765 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config\") pod \"7789491d-15dc-44e0-88d9-141ba4009010\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.351789 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb\") pod \"7789491d-15dc-44e0-88d9-141ba4009010\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.351825 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prxr\" (UniqueName: \"kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr\") pod \"7789491d-15dc-44e0-88d9-141ba4009010\" (UID: \"7789491d-15dc-44e0-88d9-141ba4009010\") " Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.367805 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr" (OuterVolumeSpecName: "kube-api-access-7prxr") pod "7789491d-15dc-44e0-88d9-141ba4009010" (UID: "7789491d-15dc-44e0-88d9-141ba4009010"). InnerVolumeSpecName "kube-api-access-7prxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.378634 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.403982 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7789491d-15dc-44e0-88d9-141ba4009010" (UID: "7789491d-15dc-44e0-88d9-141ba4009010"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.407330 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7789491d-15dc-44e0-88d9-141ba4009010" (UID: "7789491d-15dc-44e0-88d9-141ba4009010"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.409850 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7789491d-15dc-44e0-88d9-141ba4009010" (UID: "7789491d-15dc-44e0-88d9-141ba4009010"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.413857 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config" (OuterVolumeSpecName: "config") pod "7789491d-15dc-44e0-88d9-141ba4009010" (UID: "7789491d-15dc-44e0-88d9-141ba4009010"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.453803 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.453828 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.453839 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prxr\" (UniqueName: \"kubernetes.io/projected/7789491d-15dc-44e0-88d9-141ba4009010-kube-api-access-7prxr\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.453848 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.453857 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7789491d-15dc-44e0-88d9-141ba4009010-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.667080 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:44:41 crc kubenswrapper[5036]: I0110 16:44:41.678246 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-vrp42"] Jan 10 16:44:42 crc kubenswrapper[5036]: I0110 16:44:42.320769 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerStarted","Data":"9799122ac9cadc514c6fd60701f5da3efe5af896d32d9a346b0cdb47f0541856"} Jan 10 16:44:42 crc kubenswrapper[5036]: I0110 16:44:42.321217 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerStarted","Data":"56ce3b2a2af81f44226d3f4737cb8c809135adadec95a794922aa3fa50874f33"} Jan 10 16:44:42 crc kubenswrapper[5036]: I0110 16:44:42.524347 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7789491d-15dc-44e0-88d9-141ba4009010" path="/var/lib/kubelet/pods/7789491d-15dc-44e0-88d9-141ba4009010/volumes" Jan 10 16:44:43 crc kubenswrapper[5036]: I0110 16:44:43.330523 5036 generic.go:334] "Generic (PLEG): container finished" podID="6fe6dd46-603d-4595-ad27-32f98623fbcc" containerID="cbfd17d699134a608165d5a7c768f8f1ef9c076a375f09a1733cc2038619dbbb" exitCode=0 Jan 10 16:44:43 crc kubenswrapper[5036]: I0110 16:44:43.330703 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9crs" event={"ID":"6fe6dd46-603d-4595-ad27-32f98623fbcc","Type":"ContainerDied","Data":"cbfd17d699134a608165d5a7c768f8f1ef9c076a375f09a1733cc2038619dbbb"} Jan 10 16:44:43 crc 
kubenswrapper[5036]: I0110 16:44:43.335894 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerStarted","Data":"9e3efd2ace4f39b8c56d3a06ba8e665c0d02dec1e74513bab6edfc7c5675bec4"} Jan 10 16:44:43 crc kubenswrapper[5036]: I0110 16:44:43.335944 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerStarted","Data":"4b6a521a87e11a81dcf372ef9d887f370255d1aed3c815e5dc2a5e224e733d2a"} Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.774622 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j9crs" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.814774 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.869024 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data" (OuterVolumeSpecName: "config-data") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.916778 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917039 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917118 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917278 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917374 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lt62\" (UniqueName: \"kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62\") pod \"6fe6dd46-603d-4595-ad27-32f98623fbcc\" (UID: \"6fe6dd46-603d-4595-ad27-32f98623fbcc\") " Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917790 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-config-data\") on node \"crc\" 
DevicePath \"\"" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.917944 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.920065 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.920490 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts" (OuterVolumeSpecName: "scripts") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.921866 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62" (OuterVolumeSpecName: "kube-api-access-5lt62") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "kube-api-access-5lt62". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:44 crc kubenswrapper[5036]: I0110 16:44:44.940433 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe6dd46-603d-4595-ad27-32f98623fbcc" (UID: "6fe6dd46-603d-4595-ad27-32f98623fbcc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.019613 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fe6dd46-603d-4595-ad27-32f98623fbcc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.019652 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lt62\" (UniqueName: \"kubernetes.io/projected/6fe6dd46-603d-4595-ad27-32f98623fbcc-kube-api-access-5lt62\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.019667 5036 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.019702 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.019715 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe6dd46-603d-4595-ad27-32f98623fbcc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.354451 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-j9crs" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.354653 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-j9crs" event={"ID":"6fe6dd46-603d-4595-ad27-32f98623fbcc","Type":"ContainerDied","Data":"4686c23e6ac940c49e70fd5772cc4abb964c2a93cf919775231b8ac1c5e53870"} Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.355295 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4686c23e6ac940c49e70fd5772cc4abb964c2a93cf919775231b8ac1c5e53870" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.357278 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerStarted","Data":"4be2ffd36cc0ed2610d2f497f412517d63926a155d7077935831086234c81d70"} Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.358671 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.395373 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.175133169 podStartE2EDuration="5.395357906s" podCreationTimestamp="2026-01-10 16:44:40 +0000 UTC" firstStartedPulling="2026-01-10 16:44:41.374603111 +0000 UTC m=+1003.244838605" lastFinishedPulling="2026-01-10 16:44:44.594827838 +0000 UTC m=+1006.465063342" observedRunningTime="2026-01-10 16:44:45.38423878 +0000 UTC m=+1007.254474294" watchObservedRunningTime="2026-01-10 16:44:45.395357906 +0000 UTC m=+1007.265593400" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.651187 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:44:45 crc kubenswrapper[5036]: E0110 16:44:45.656674 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="init" Jan 10 16:44:45 crc 
kubenswrapper[5036]: I0110 16:44:45.656732 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="init" Jan 10 16:44:45 crc kubenswrapper[5036]: E0110 16:44:45.656767 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="dnsmasq-dns" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.656777 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="dnsmasq-dns" Jan 10 16:44:45 crc kubenswrapper[5036]: E0110 16:44:45.656795 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" containerName="cinder-db-sync" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.656804 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" containerName="cinder-db-sync" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.657227 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7789491d-15dc-44e0-88d9-141ba4009010" containerName="dnsmasq-dns" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.657251 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" containerName="cinder-db-sync" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.658390 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.662439 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-6zp6p" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.662479 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.662542 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.662877 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.667938 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.739342 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.747179 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.759712 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.830617 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.831994 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833092 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833160 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833284 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjnd\" (UniqueName: \"kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833316 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833349 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.833517 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.835311 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.842369 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.934871 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.934954 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.934999 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935025 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935066 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935120 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935142 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-877bb\" (UniqueName: \"kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935190 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935215 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935274 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935311 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935383 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjnd\" (UniqueName: 
\"kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935409 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935433 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935470 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935493 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pzx\" (UniqueName: \"kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935514 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935548 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.935657 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.939812 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.940378 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " 
pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.943398 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.952508 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.953015 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjnd\" (UniqueName: \"kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd\") pod \"cinder-scheduler-0\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " pod="openstack/cinder-scheduler-0" Jan 10 16:44:45 crc kubenswrapper[5036]: I0110 16:44:45.990538 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037484 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037527 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037570 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037595 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037639 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037661 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-877bb\" (UniqueName: \"kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 
16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037697 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037763 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037782 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037809 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037834 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.037853 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pzx\" (UniqueName: \"kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.038195 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.038870 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.039507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.039849 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") 
" pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.039903 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.040103 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.043339 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.044130 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.051089 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.052975 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.062550 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-877bb\" (UniqueName: \"kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb\") pod \"dnsmasq-dns-6d97fcdd8f-zv5r2\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.062802 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pzx\" (UniqueName: \"kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx\") pod \"cinder-api-0\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.071051 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.157767 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.487167 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.630984 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:44:46 crc kubenswrapper[5036]: I0110 16:44:46.734961 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.379953 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerStarted","Data":"6a58c222e1eab74e51af3f1e8bf9a3bfc752dd5f2ec62762a49ce4cde6d621a8"} Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.381376 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerStarted","Data":"38dbf409bcb488b4aa2ba34222e30cde6c7637192557ac1df28b13ffb7088194"} Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.382839 5036 generic.go:334] "Generic (PLEG): container finished" podID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerID="e81520bc27313e433ac11a4c5115d433794d2013edc4e2669d381b538e0e9098" exitCode=0 Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.382952 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" event={"ID":"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac","Type":"ContainerDied","Data":"e81520bc27313e433ac11a4c5115d433794d2013edc4e2669d381b538e0e9098"} Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.383084 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" event={"ID":"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac","Type":"ContainerStarted","Data":"0ea23acb1b5f4d0fef3c4f9b97ca551ff6835da5b1a12cf653c4d7aa73b1ee5c"} Jan 10 16:44:47 crc kubenswrapper[5036]: I0110 16:44:47.813856 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.475919 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerStarted","Data":"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293"} Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.489219 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerStarted","Data":"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d"} Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.493244 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" event={"ID":"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac","Type":"ContainerStarted","Data":"9ecf3be69c813eac55bc3cabc940b42fb5baf783c6673aee83b2d6b3f92f966b"} Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.493588 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.513240 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" podStartSLOduration=3.513216336 podStartE2EDuration="3.513216336s" podCreationTimestamp="2026-01-10 16:44:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:48.5115709 +0000 UTC m=+1010.381806404" watchObservedRunningTime="2026-01-10 16:44:48.513216336 +0000 UTC m=+1010.383451840" Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.521673 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.562615 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7655587964-dzfxf" Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.649917 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.650394 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69f754595d-jrtgk" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api-log" containerID="cri-o://6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db" gracePeriod=30 Jan 10 16:44:48 crc kubenswrapper[5036]: I0110 16:44:48.650508 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69f754595d-jrtgk" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api" containerID="cri-o://0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd" gracePeriod=30 Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.502444 5036 generic.go:334] "Generic (PLEG): container finished" podID="8aa25bba-4193-480c-91ab-2dd659103e99" containerID="6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db" exitCode=143 Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.502853 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerDied","Data":"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db"} Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.507235 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerStarted","Data":"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00"} Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.512749 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerStarted","Data":"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f"} Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.512962 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api-log" containerID="cri-o://08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" gracePeriod=30 Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.513037 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api" containerID="cri-o://c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" gracePeriod=30 Jan 10 16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.513334 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 10 
16:44:49 crc kubenswrapper[5036]: I0110 16:44:49.543152 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.712730142 podStartE2EDuration="4.543132789s" podCreationTimestamp="2026-01-10 16:44:45 +0000 UTC" firstStartedPulling="2026-01-10 16:44:46.498757667 +0000 UTC m=+1008.368993161" lastFinishedPulling="2026-01-10 16:44:47.329160314 +0000 UTC m=+1009.199395808" observedRunningTime="2026-01-10 16:44:49.53576283 +0000 UTC m=+1011.405998324" watchObservedRunningTime="2026-01-10 16:44:49.543132789 +0000 UTC m=+1011.413368283" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.128263 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.289249 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.289398 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290120 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2pzx\" (UniqueName: \"kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290201 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290293 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290374 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290434 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom\") pod \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\" (UID: \"efd4f03f-660d-48f2-8cd7-8c958d78e2b1\") " Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290810 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: 
"efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.290995 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs" (OuterVolumeSpecName: "logs") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.291152 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.291192 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.302964 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx" (OuterVolumeSpecName: "kube-api-access-v2pzx") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "kube-api-access-v2pzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.303102 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts" (OuterVolumeSpecName: "scripts") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.306987 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.389798 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.393629 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2pzx\" (UniqueName: \"kubernetes.io/projected/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-kube-api-access-v2pzx\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.393658 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.393668 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.393693 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.396827 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data" (OuterVolumeSpecName: "config-data") pod "efd4f03f-660d-48f2-8cd7-8c958d78e2b1" (UID: "efd4f03f-660d-48f2-8cd7-8c958d78e2b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.494929 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd4f03f-660d-48f2-8cd7-8c958d78e2b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524318 5036 generic.go:334] "Generic (PLEG): container finished" podID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerID="c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" exitCode=0 Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524360 5036 generic.go:334] "Generic (PLEG): container finished" podID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerID="08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" exitCode=143 Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524472 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524841 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerDied","Data":"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f"} Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524884 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerDied","Data":"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293"} Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524896 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"efd4f03f-660d-48f2-8cd7-8c958d78e2b1","Type":"ContainerDied","Data":"6a58c222e1eab74e51af3f1e8bf9a3bfc752dd5f2ec62762a49ce4cde6d621a8"} Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.524913 5036 scope.go:117] "RemoveContainer" containerID="c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.560405 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.562573 5036 scope.go:117] "RemoveContainer" containerID="08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.571429 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.583398 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:50 crc kubenswrapper[5036]: E0110 16:44:50.583777 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api-log" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.583805 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api-log" Jan 10 16:44:50 crc kubenswrapper[5036]: E0110 16:44:50.583835 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.583844 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.584013 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.584036 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" containerName="cinder-api-log" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.584887 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.597062 5036 scope.go:117] "RemoveContainer" containerID="c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.597627 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.597830 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.597944 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 10 16:44:50 crc kubenswrapper[5036]: E0110 16:44:50.597946 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f\": container with ID starting with c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f not found: ID does not exist" containerID="c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.597983 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f"} err="failed to get container status \"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f\": rpc error: code = NotFound desc = could not find container \"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f\": container with ID starting with c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f not found: ID does not exist" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.598008 5036 scope.go:117] "RemoveContainer" containerID="08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" Jan 10 16:44:50 crc kubenswrapper[5036]: E0110 16:44:50.598300 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293\": container with ID starting with 08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293 not found: ID does not exist" containerID="08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.598328 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293"} err="failed to get container status \"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293\": rpc error: code = NotFound desc = could not find container \"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293\": container with ID starting with 08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293 not found: ID does not exist" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.598349 5036 scope.go:117] "RemoveContainer" containerID="c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.598923 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f"} err="failed to get container status \"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f\": rpc 
error: code = NotFound desc = could not find container \"c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f\": container with ID starting with c8a1189274f0116a725e3bfa87bf659c2a7c6b790c43095b2fe376120aae832f not found: ID does not exist" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.598944 5036 scope.go:117] "RemoveContainer" containerID="08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.599132 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293"} err="failed to get container status \"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293\": rpc error: code = NotFound desc = could not find container \"08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293\": container with ID starting with 08a5955fe89d864ae7a107caf19c69a4101115ac8e8fe9ea789592d5db8c6293 not found: ID does not exist" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.606779 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.700832 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.700953 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701015 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701105 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701146 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701185 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-scripts\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701316 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnld6\" (UniqueName: \"kubernetes.io/projected/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-kube-api-access-vnld6\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701486 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-logs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.701534 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.803539 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.803630 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.803651 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.803699 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-scripts\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.803757 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.804407 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnld6\" (UniqueName: \"kubernetes.io/projected/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-kube-api-access-vnld6\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.804499 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-logs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 
16:44:50.804532 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.804633 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.804725 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.804964 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-logs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.808178 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-scripts\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.808329 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.810566 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.812493 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-config-data-custom\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.817001 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.818464 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.835396 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vnld6\" (UniqueName: \"kubernetes.io/projected/04bfc371-7aba-4a4d-b018-4a79ad8a0b7b-kube-api-access-vnld6\") pod \"cinder-api-0\" (UID: \"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b\") " pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.937197 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 10 16:44:50 crc kubenswrapper[5036]: I0110 16:44:50.990969 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 10 16:44:51 crc kubenswrapper[5036]: I0110 16:44:51.441119 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 10 16:44:51 crc kubenswrapper[5036]: W0110 16:44:51.444429 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04bfc371_7aba_4a4d_b018_4a79ad8a0b7b.slice/crio-56e98b62d553ca2be4786f16cac78b5bff1c76a05e2e6750fd26ec96e6751cd5 WatchSource:0}: Error finding container 56e98b62d553ca2be4786f16cac78b5bff1c76a05e2e6750fd26ec96e6751cd5: Status 404 returned error can't find the container with id 56e98b62d553ca2be4786f16cac78b5bff1c76a05e2e6750fd26ec96e6751cd5 Jan 10 16:44:51 crc kubenswrapper[5036]: I0110 16:44:51.538094 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b","Type":"ContainerStarted","Data":"56e98b62d553ca2be4786f16cac78b5bff1c76a05e2e6750fd26ec96e6751cd5"} Jan 10 16:44:51 crc kubenswrapper[5036]: I0110 16:44:51.817545 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69f754595d-jrtgk" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:53736->10.217.0.144:9311: read: connection reset by peer" Jan 10 16:44:51 crc kubenswrapper[5036]: I0110 16:44:51.817621 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69f754595d-jrtgk" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:53746->10.217.0.144:9311: read: connection reset by peer" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.164542 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.234188 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs\") pod \"8aa25bba-4193-480c-91ab-2dd659103e99\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.234272 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom\") pod \"8aa25bba-4193-480c-91ab-2dd659103e99\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.234353 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle\") pod \"8aa25bba-4193-480c-91ab-2dd659103e99\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.234507 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk5c9\" (UniqueName: \"kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9\") pod \"8aa25bba-4193-480c-91ab-2dd659103e99\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.234542 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data\") pod \"8aa25bba-4193-480c-91ab-2dd659103e99\" (UID: \"8aa25bba-4193-480c-91ab-2dd659103e99\") " Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.235085 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs" (OuterVolumeSpecName: "logs") pod "8aa25bba-4193-480c-91ab-2dd659103e99" (UID: "8aa25bba-4193-480c-91ab-2dd659103e99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.241272 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8aa25bba-4193-480c-91ab-2dd659103e99" (UID: "8aa25bba-4193-480c-91ab-2dd659103e99"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.241285 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9" (OuterVolumeSpecName: "kube-api-access-xk5c9") pod "8aa25bba-4193-480c-91ab-2dd659103e99" (UID: "8aa25bba-4193-480c-91ab-2dd659103e99"). InnerVolumeSpecName "kube-api-access-xk5c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.277710 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8aa25bba-4193-480c-91ab-2dd659103e99" (UID: "8aa25bba-4193-480c-91ab-2dd659103e99"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.284400 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data" (OuterVolumeSpecName: "config-data") pod "8aa25bba-4193-480c-91ab-2dd659103e99" (UID: "8aa25bba-4193-480c-91ab-2dd659103e99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.336605 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa25bba-4193-480c-91ab-2dd659103e99-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.336652 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.336666 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.336726 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk5c9\" (UniqueName: \"kubernetes.io/projected/8aa25bba-4193-480c-91ab-2dd659103e99-kube-api-access-xk5c9\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.336741 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa25bba-4193-480c-91ab-2dd659103e99-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.517689 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd4f03f-660d-48f2-8cd7-8c958d78e2b1" path="/var/lib/kubelet/pods/efd4f03f-660d-48f2-8cd7-8c958d78e2b1/volumes" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.549117 5036 generic.go:334] "Generic (PLEG): container finished" podID="8aa25bba-4193-480c-91ab-2dd659103e99" containerID="0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd" exitCode=0 Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.549249 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerDied","Data":"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd"} Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.549259 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69f754595d-jrtgk" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.549303 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69f754595d-jrtgk" event={"ID":"8aa25bba-4193-480c-91ab-2dd659103e99","Type":"ContainerDied","Data":"22140576d0578e7b1e5bab75cbdfac54ec86bf9868d1df5dec99a0cd03ad99f1"} Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.549328 5036 scope.go:117] "RemoveContainer" containerID="0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.550593 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b","Type":"ContainerStarted","Data":"2643f7ffad52ccd6c2f00b620f32ab6a5bb9a54472e84e9131b58bd11b5a4800"} Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.592216 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.593248 5036 scope.go:117] "RemoveContainer" containerID="6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.599896 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69f754595d-jrtgk"] Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.614467 5036 scope.go:117] "RemoveContainer" containerID="0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd" Jan 10 16:44:52 crc kubenswrapper[5036]: E0110 16:44:52.614937 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd\": container with ID starting with 0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd not found: ID does not exist" containerID="0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.614969 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd"} err="failed to get container status \"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd\": rpc error: code = NotFound desc = could not find container \"0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd\": container with ID starting with 0f91198ab17d13bdb28b67a429899a2b18f71ee6fd58b3b2788616f1b0f3eefd not found: ID does not exist" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.614989 5036 scope.go:117] "RemoveContainer" containerID="6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db" Jan 10 16:44:52 crc kubenswrapper[5036]: E0110 16:44:52.615793 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db\": container with ID starting with 6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db not found: ID does not exist" containerID="6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db" Jan 10 16:44:52 crc kubenswrapper[5036]: I0110 16:44:52.615818 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db"} err="failed to get container status 
\"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db\": rpc error: code = NotFound desc = could not find container \"6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db\": container with ID starting with 6c683e6434ac45501466f3d13add3a1d1d76e1583297689d8eb4362dd106c3db not found: ID does not exist" Jan 10 16:44:53 crc kubenswrapper[5036]: I0110 16:44:53.562231 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"04bfc371-7aba-4a4d-b018-4a79ad8a0b7b","Type":"ContainerStarted","Data":"082a2e5e7ff0ca80cd47cd4c97d130fb922eb7c7c1475ec443e6f269e70b892f"} Jan 10 16:44:53 crc kubenswrapper[5036]: I0110 16:44:53.562816 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 10 16:44:54 crc kubenswrapper[5036]: I0110 16:44:54.221833 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ffbbc4bd-swcjc" Jan 10 16:44:54 crc kubenswrapper[5036]: I0110 16:44:54.247730 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.247707758 podStartE2EDuration="4.247707758s" podCreationTimestamp="2026-01-10 16:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:44:53.601121312 +0000 UTC m=+1015.471356826" watchObservedRunningTime="2026-01-10 16:44:54.247707758 +0000 UTC m=+1016.117943252" Jan 10 16:44:54 crc kubenswrapper[5036]: I0110 16:44:54.518949 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" path="/var/lib/kubelet/pods/8aa25bba-4193-480c-91ab-2dd659103e99/volumes" Jan 10 16:44:55 crc kubenswrapper[5036]: I0110 16:44:55.275370 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:44:55 crc kubenswrapper[5036]: I0110 16:44:55.904579 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:44:55 crc kubenswrapper[5036]: I0110 16:44:55.904639 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.073066 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.144910 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.145379 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="dnsmasq-dns" containerID="cri-o://efcc472b386dfdcb34147901510f490a00d52d9f868ac42d620fe9dcdc7f83ae" gracePeriod=10 Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.314882 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.360750 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.594026 5036 generic.go:334] "Generic (PLEG): container finished" podID="d0567814-352b-4f05-8175-a103c0f98d0b" containerID="efcc472b386dfdcb34147901510f490a00d52d9f868ac42d620fe9dcdc7f83ae" exitCode=0 Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.594089 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" event={"ID":"d0567814-352b-4f05-8175-a103c0f98d0b","Type":"ContainerDied","Data":"efcc472b386dfdcb34147901510f490a00d52d9f868ac42d620fe9dcdc7f83ae"} Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.594224 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="cinder-scheduler" containerID="cri-o://3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d" gracePeriod=30 Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.594315 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="probe" containerID="cri-o://0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00" gracePeriod=30 Jan 10 16:44:56 crc kubenswrapper[5036]: I0110 16:44:56.994065 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f7c9c789b-dj95d" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.197830 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.324463 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc\") pod \"d0567814-352b-4f05-8175-a103c0f98d0b\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.324618 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb\") pod \"d0567814-352b-4f05-8175-a103c0f98d0b\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.324697 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hl6p\" (UniqueName: \"kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p\") pod \"d0567814-352b-4f05-8175-a103c0f98d0b\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.324725 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config\") pod \"d0567814-352b-4f05-8175-a103c0f98d0b\" (UID: \"d0567814-352b-4f05-8175-a103c0f98d0b\") " Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.324753 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb\") pod \"d0567814-352b-4f05-8175-a103c0f98d0b\" (UID: 
\"d0567814-352b-4f05-8175-a103c0f98d0b\") " Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.341589 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p" (OuterVolumeSpecName: "kube-api-access-4hl6p") pod "d0567814-352b-4f05-8175-a103c0f98d0b" (UID: "d0567814-352b-4f05-8175-a103c0f98d0b"). InnerVolumeSpecName "kube-api-access-4hl6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.376894 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 10 16:44:57 crc kubenswrapper[5036]: E0110 16:44:57.377378 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api-log" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.377402 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api-log" Jan 10 16:44:57 crc kubenswrapper[5036]: E0110 16:44:57.377431 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.377440 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api" Jan 10 16:44:57 crc kubenswrapper[5036]: E0110 16:44:57.378056 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="dnsmasq-dns" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.378071 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="dnsmasq-dns" Jan 10 16:44:57 crc kubenswrapper[5036]: E0110 16:44:57.378083 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="init" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.378091 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="init" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.378236 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api-log" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.378254 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa25bba-4193-480c-91ab-2dd659103e99" containerName="barbican-api" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.378265 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" containerName="dnsmasq-dns" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.379012 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.382040 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.384176 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.384547 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wm69c" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.387742 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config" (OuterVolumeSpecName: "config") pod "d0567814-352b-4f05-8175-a103c0f98d0b" (UID: "d0567814-352b-4f05-8175-a103c0f98d0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.388406 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0567814-352b-4f05-8175-a103c0f98d0b" (UID: "d0567814-352b-4f05-8175-a103c0f98d0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.388853 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0567814-352b-4f05-8175-a103c0f98d0b" (UID: "d0567814-352b-4f05-8175-a103c0f98d0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.389200 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0567814-352b-4f05-8175-a103c0f98d0b" (UID: "d0567814-352b-4f05-8175-a103c0f98d0b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.399730 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426447 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426504 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config-secret\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426549 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg95q\" (UniqueName: \"kubernetes.io/projected/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-kube-api-access-dg95q\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426657 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-combined-ca-bundle\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426776 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426799 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hl6p\" (UniqueName: \"kubernetes.io/projected/d0567814-352b-4f05-8175-a103c0f98d0b-kube-api-access-4hl6p\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426813 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426826 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.426838 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0567814-352b-4f05-8175-a103c0f98d0b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.528562 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.528595 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config-secret\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.528624 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg95q\" (UniqueName: \"kubernetes.io/projected/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-kube-api-access-dg95q\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.528694 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-combined-ca-bundle\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.530055 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.532611 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-combined-ca-bundle\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.533261 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-openstack-config-secret\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.549308 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg95q\" (UniqueName: \"kubernetes.io/projected/70cfbefa-2928-4ca5-aa74-93fb1b4cd059-kube-api-access-dg95q\") pod \"openstackclient\" (UID: \"70cfbefa-2928-4ca5-aa74-93fb1b4cd059\") " pod="openstack/openstackclient" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.605264 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" event={"ID":"d0567814-352b-4f05-8175-a103c0f98d0b","Type":"ContainerDied","Data":"b31887c4a762197913f6ddce21d7ecb35ed67b9af1144ec42be6b75a2002d8a5"} Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.605346 5036 scope.go:117] "RemoveContainer" containerID="efcc472b386dfdcb34147901510f490a00d52d9f868ac42d620fe9dcdc7f83ae" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.605545 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-hzq7n" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.609129 5036 generic.go:334] "Generic (PLEG): container finished" podID="6f51970d-278c-486c-bb97-000949f83751" containerID="0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00" exitCode=0 Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.609167 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerDied","Data":"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00"} Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.647491 5036 scope.go:117] "RemoveContainer" containerID="f25d9f065af64361444059eed7131df34be5677f2136d35ea3559d87cc758371" Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.654583 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.664512 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-hzq7n"] Jan 10 16:44:57 crc kubenswrapper[5036]: I0110 16:44:57.763606 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.326526 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.516966 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0567814-352b-4f05-8175-a103c0f98d0b" path="/var/lib/kubelet/pods/d0567814-352b-4f05-8175-a103c0f98d0b/volumes" Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.624696 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"70cfbefa-2928-4ca5-aa74-93fb1b4cd059","Type":"ContainerStarted","Data":"3aaa167f4e73992e64e1d92e16d6f502fb95c2ee94b673301426ea69a3b4b07d"} Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.824301 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-74d5fd97c9-96pjx" Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.889479 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.889765 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547c4cc84d-fr8g2" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-api" containerID="cri-o://9c50d3496b2756f1c362e47205a150378df5f2ba11fe1c0cd887ab6068044258" gracePeriod=30 Jan 10 16:44:58 crc kubenswrapper[5036]: I0110 16:44:58.889897 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-547c4cc84d-fr8g2" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-httpd" containerID="cri-o://d7c2f571918cef4b3224e7237afd1f22cdfacf8af387df5e4f12b31241286bc2" gracePeriod=30 Jan 10 16:44:59 crc kubenswrapper[5036]: I0110 16:44:59.639599 5036 generic.go:334] "Generic (PLEG): container finished" podID="58348536-72b1-4f0f-b836-6ff265673fa0" containerID="d7c2f571918cef4b3224e7237afd1f22cdfacf8af387df5e4f12b31241286bc2" exitCode=0 Jan 10 16:44:59 crc kubenswrapper[5036]: I0110 16:44:59.639643 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" 
event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerDied","Data":"d7c2f571918cef4b3224e7237afd1f22cdfacf8af387df5e4f12b31241286bc2"} Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.158108 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2"] Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.159465 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.161371 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.161642 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.168732 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2"] Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.249068 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.279797 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgqc9\" (UniqueName: \"kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.279895 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.280184 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383341 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383394 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qjnd\" (UniqueName: \"kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383455 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383535 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383571 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383637 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data\") pod \"6f51970d-278c-486c-bb97-000949f83751\" (UID: \"6f51970d-278c-486c-bb97-000949f83751\") " Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.383978 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.384068 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgqc9\" (UniqueName: \"kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.384138 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.385041 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.387750 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.393364 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts" (OuterVolumeSpecName: "scripts") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.393462 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.394014 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.407496 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd" (OuterVolumeSpecName: "kube-api-access-6qjnd") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "kube-api-access-6qjnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.415558 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgqc9\" (UniqueName: \"kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9\") pod \"collect-profiles-29467725-5xzc2\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.443419 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.488428 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.488462 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qjnd\" (UniqueName: \"kubernetes.io/projected/6f51970d-278c-486c-bb97-000949f83751-kube-api-access-6qjnd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.488476 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f51970d-278c-486c-bb97-000949f83751-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.488484 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.488493 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.555061 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.565809 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data" (OuterVolumeSpecName: "config-data") pod "6f51970d-278c-486c-bb97-000949f83751" (UID: "6f51970d-278c-486c-bb97-000949f83751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.590197 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f51970d-278c-486c-bb97-000949f83751-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.653795 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.653820 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerDied","Data":"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d"} Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.653870 5036 scope.go:117] "RemoveContainer" containerID="0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.653787 5036 generic.go:334] "Generic (PLEG): container finished" podID="6f51970d-278c-486c-bb97-000949f83751" containerID="3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d" exitCode=0 Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.653996 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6f51970d-278c-486c-bb97-000949f83751","Type":"ContainerDied","Data":"38dbf409bcb488b4aa2ba34222e30cde6c7637192557ac1df28b13ffb7088194"} Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.697148 5036 scope.go:117] "RemoveContainer" containerID="3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.712244 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.761367 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.783628 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:45:00 crc kubenswrapper[5036]: E0110 16:45:00.785050 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="cinder-scheduler" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.785085 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="cinder-scheduler" Jan 10 16:45:00 crc kubenswrapper[5036]: E0110 16:45:00.785121 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="probe" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.785128 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="probe" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.785355 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="probe" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.785378 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f51970d-278c-486c-bb97-000949f83751" containerName="cinder-scheduler" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.786918 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.798430 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.800852 5036 scope.go:117] "RemoveContainer" containerID="0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.804552 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:45:00 crc kubenswrapper[5036]: E0110 16:45:00.808891 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00\": container with ID starting with 0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00 not found: ID does not exist" containerID="0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.808926 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00"} err="failed to get container status \"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00\": rpc error: code = NotFound desc = could not find container \"0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00\": container with ID starting with 0b1478aa7a2900fc01315b10adacd108c9cd7423c88c1d22c5fe80b9aeb93d00 not found: ID does not exist" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.808956 5036 scope.go:117] "RemoveContainer" containerID="3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d" Jan 10 16:45:00 crc kubenswrapper[5036]: E0110 16:45:00.816791 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d\": container with ID starting with 3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d not found: ID does not exist" containerID="3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.816834 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d"} err="failed to get container status \"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d\": rpc error: code = NotFound desc = could not find container \"3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d\": container with ID starting with 3b2fd17cd241fee7780cfbf86a0669532974a0573fca26c8f423f0cb9be2451d not found: ID does not exist" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.854012 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2"] Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.899877 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9849cf-82c8-4f9d-86f2-c7bf664528c9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.899959 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.900005 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.900037 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.900088 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:00 crc kubenswrapper[5036]: I0110 16:45:00.900104 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q876m\" (UniqueName: \"kubernetes.io/projected/db9849cf-82c8-4f9d-86f2-c7bf664528c9-kube-api-access-q876m\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001576 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001618 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q876m\" (UniqueName: \"kubernetes.io/projected/db9849cf-82c8-4f9d-86f2-c7bf664528c9-kube-api-access-q876m\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001661 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9849cf-82c8-4f9d-86f2-c7bf664528c9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001725 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001765 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.001793 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.002509 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/db9849cf-82c8-4f9d-86f2-c7bf664528c9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.009249 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-scripts\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.009706 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.009748 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.014321 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db9849cf-82c8-4f9d-86f2-c7bf664528c9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.027583 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q876m\" (UniqueName: \"kubernetes.io/projected/db9849cf-82c8-4f9d-86f2-c7bf664528c9-kube-api-access-q876m\") pod \"cinder-scheduler-0\" (UID: \"db9849cf-82c8-4f9d-86f2-c7bf664528c9\") " pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.125672 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.644955 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 10 16:45:01 crc kubenswrapper[5036]: W0110 16:45:01.650828 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb9849cf_82c8_4f9d_86f2_c7bf664528c9.slice/crio-a1c1f74f5ace8d2d16eefcd2ccbdac7f54b91306ed82a7e60ab077f6b001f347 WatchSource:0}: Error finding container a1c1f74f5ace8d2d16eefcd2ccbdac7f54b91306ed82a7e60ab077f6b001f347: Status 404 returned error can't find the container with id a1c1f74f5ace8d2d16eefcd2ccbdac7f54b91306ed82a7e60ab077f6b001f347 Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.672222 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" event={"ID":"7d1b58ad-b491-4354-a0b0-3ab868370dc9","Type":"ContainerStarted","Data":"b16a0b2c30b5bc3790ac01f824c82c020f74640e6ba51070b62457f494fd2647"} Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.672271 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" event={"ID":"7d1b58ad-b491-4354-a0b0-3ab868370dc9","Type":"ContainerStarted","Data":"4a71aa26e42592ba8e87f77bcd3cceb2888b1cf39f71b114cbae81109fc61a8b"} Jan 10 16:45:01 crc kubenswrapper[5036]: I0110 16:45:01.673599 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9849cf-82c8-4f9d-86f2-c7bf664528c9","Type":"ContainerStarted","Data":"a1c1f74f5ace8d2d16eefcd2ccbdac7f54b91306ed82a7e60ab077f6b001f347"} Jan 10 16:45:02 crc kubenswrapper[5036]: I0110 16:45:02.524270 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f51970d-278c-486c-bb97-000949f83751" path="/var/lib/kubelet/pods/6f51970d-278c-486c-bb97-000949f83751/volumes" Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.467264 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.705738 5036 generic.go:334] "Generic (PLEG): container finished" podID="7d1b58ad-b491-4354-a0b0-3ab868370dc9" containerID="b16a0b2c30b5bc3790ac01f824c82c020f74640e6ba51070b62457f494fd2647" exitCode=0 Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.705785 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" event={"ID":"7d1b58ad-b491-4354-a0b0-3ab868370dc9","Type":"ContainerDied","Data":"b16a0b2c30b5bc3790ac01f824c82c020f74640e6ba51070b62457f494fd2647"} Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.739825 5036 generic.go:334] "Generic (PLEG): container finished" podID="58348536-72b1-4f0f-b836-6ff265673fa0" containerID="9c50d3496b2756f1c362e47205a150378df5f2ba11fe1c0cd887ab6068044258" exitCode=0 Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.739904 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerDied","Data":"9c50d3496b2756f1c362e47205a150378df5f2ba11fe1c0cd887ab6068044258"} Jan 10 16:45:03 crc kubenswrapper[5036]: I0110 16:45:03.745335 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"db9849cf-82c8-4f9d-86f2-c7bf664528c9","Type":"ContainerStarted","Data":"97d528303bf9383aa7160670697fa803581f886e722784cfa70f3b2fec64706a"} Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.027409 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.157258 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config\") pod \"58348536-72b1-4f0f-b836-6ff265673fa0\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.157616 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config\") pod \"58348536-72b1-4f0f-b836-6ff265673fa0\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.157815 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs\") pod \"58348536-72b1-4f0f-b836-6ff265673fa0\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.157874 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chsqp\" (UniqueName: \"kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp\") pod \"58348536-72b1-4f0f-b836-6ff265673fa0\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.157904 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle\") pod \"58348536-72b1-4f0f-b836-6ff265673fa0\" (UID: \"58348536-72b1-4f0f-b836-6ff265673fa0\") " Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.164301 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp" (OuterVolumeSpecName: "kube-api-access-chsqp") pod "58348536-72b1-4f0f-b836-6ff265673fa0" (UID: "58348536-72b1-4f0f-b836-6ff265673fa0"). InnerVolumeSpecName "kube-api-access-chsqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.165606 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "58348536-72b1-4f0f-b836-6ff265673fa0" (UID: "58348536-72b1-4f0f-b836-6ff265673fa0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.205429 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58348536-72b1-4f0f-b836-6ff265673fa0" (UID: "58348536-72b1-4f0f-b836-6ff265673fa0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.225864 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config" (OuterVolumeSpecName: "config") pod "58348536-72b1-4f0f-b836-6ff265673fa0" (UID: "58348536-72b1-4f0f-b836-6ff265673fa0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.236084 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "58348536-72b1-4f0f-b836-6ff265673fa0" (UID: "58348536-72b1-4f0f-b836-6ff265673fa0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.259864 5036 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.259901 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.259913 5036 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.259930 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chsqp\" (UniqueName: \"kubernetes.io/projected/58348536-72b1-4f0f-b836-6ff265673fa0-kube-api-access-chsqp\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.259945 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58348536-72b1-4f0f-b836-6ff265673fa0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.797327 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"db9849cf-82c8-4f9d-86f2-c7bf664528c9","Type":"ContainerStarted","Data":"7803f63667de5492f296dc9952f6ae68105eb1f8e734e2c275e8f86e43f442c5"} Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.803656 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547c4cc84d-fr8g2" event={"ID":"58348536-72b1-4f0f-b836-6ff265673fa0","Type":"ContainerDied","Data":"5bc70539aa69aa7023f653e028042c3ee40d087037fd2fe6d281659ed7a45afa"} Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.803672 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-547c4cc84d-fr8g2" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.803730 5036 scope.go:117] "RemoveContainer" containerID="d7c2f571918cef4b3224e7237afd1f22cdfacf8af387df5e4f12b31241286bc2" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.822996 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.822982448 podStartE2EDuration="4.822982448s" podCreationTimestamp="2026-01-10 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:04.821714032 +0000 UTC m=+1026.691949526" watchObservedRunningTime="2026-01-10 16:45:04.822982448 +0000 UTC m=+1026.693217942" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.846537 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.854965 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-547c4cc84d-fr8g2"] Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.881358 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-sz6zr"] Jan 10 16:45:04 crc kubenswrapper[5036]: E0110 16:45:04.881693 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-api" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.881708 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-api" Jan 10 16:45:04 crc kubenswrapper[5036]: E0110 16:45:04.881743 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-httpd" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.881750 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-httpd" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.881919 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-api" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.881943 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" containerName="neutron-httpd" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.882419 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:04 crc kubenswrapper[5036]: I0110 16:45:04.950382 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sz6zr"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.002920 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-d24s2"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.004635 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.006045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.006132 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rw6\" (UniqueName: \"kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.014760 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d24s2"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.077411 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d6g85"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.078799 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.100748 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6g85"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.107066 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69rw6\" (UniqueName: \"kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.107106 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.107146 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmck9\" (UniqueName: \"kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.107189 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.107873 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc 
kubenswrapper[5036]: I0110 16:45:05.110533 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6758-account-create-update-mbg5f"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.111943 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.115728 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.142786 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6758-account-create-update-mbg5f"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.143249 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rw6\" (UniqueName: \"kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6\") pod \"nova-api-db-create-sz6zr\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.213905 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.213987 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.214073 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.214130 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74t4d\" (UniqueName: \"kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.214164 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmck9\" (UniqueName: \"kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.214226 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ddp\" (UniqueName: \"kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " 
pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.215418 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.230802 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.283391 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b3eb-account-create-update-bln2k"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.284953 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.288392 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.290270 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b3eb-account-create-update-bln2k"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.291478 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmck9\" (UniqueName: \"kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9\") pod \"nova-cell0-db-create-d24s2\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.317372 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.317441 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.318492 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.319510 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74t4d\" (UniqueName: \"kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.319613 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ddp\" (UniqueName: 
\"kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.320879 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.331053 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.346431 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74t4d\" (UniqueName: \"kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d\") pod \"nova-api-6758-account-create-update-mbg5f\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.369308 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ddp\" (UniqueName: \"kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp\") pod \"nova-cell1-db-create-d6g85\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.397866 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.421291 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.421403 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqbg9\" (UniqueName: \"kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.463524 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1e9d-account-create-update-xf4tj"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.464498 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.469105 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.477362 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1e9d-account-create-update-xf4tj"] Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.481415 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.522737 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.522805 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.522885 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqbg9\" (UniqueName: \"kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.522990 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pt6\" (UniqueName: \"kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.523900 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.565350 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqbg9\" (UniqueName: \"kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9\") pod \"nova-cell0-b3eb-account-create-update-bln2k\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.621741 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.624915 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6pt6\" (UniqueName: \"kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.624973 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.625815 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.653457 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6pt6\" (UniqueName: \"kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6\") pod \"nova-cell1-1e9d-account-create-update-xf4tj\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:05 crc kubenswrapper[5036]: I0110 16:45:05.796302 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:06 crc kubenswrapper[5036]: I0110 16:45:06.126753 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 10 16:45:06 crc kubenswrapper[5036]: I0110 16:45:06.521972 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58348536-72b1-4f0f-b836-6ff265673fa0" path="/var/lib/kubelet/pods/58348536-72b1-4f0f-b836-6ff265673fa0/volumes" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.284943 5036 scope.go:117] "RemoveContainer" containerID="9c50d3496b2756f1c362e47205a150378df5f2ba11fe1c0cd887ab6068044258" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.531099 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.609590 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgqc9\" (UniqueName: \"kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9\") pod \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.609666 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume\") pod \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.609815 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume\") pod \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\" (UID: \"7d1b58ad-b491-4354-a0b0-3ab868370dc9\") " Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.617442 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d1b58ad-b491-4354-a0b0-3ab868370dc9" (UID: "7d1b58ad-b491-4354-a0b0-3ab868370dc9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.620993 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9" (OuterVolumeSpecName: "kube-api-access-cgqc9") pod "7d1b58ad-b491-4354-a0b0-3ab868370dc9" (UID: "7d1b58ad-b491-4354-a0b0-3ab868370dc9"). InnerVolumeSpecName "kube-api-access-cgqc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.657804 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d1b58ad-b491-4354-a0b0-3ab868370dc9" (UID: "7d1b58ad-b491-4354-a0b0-3ab868370dc9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.716324 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d1b58ad-b491-4354-a0b0-3ab868370dc9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.716366 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d1b58ad-b491-4354-a0b0-3ab868370dc9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.716381 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgqc9\" (UniqueName: \"kubernetes.io/projected/7d1b58ad-b491-4354-a0b0-3ab868370dc9-kube-api-access-cgqc9\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.874042 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.874035 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2" event={"ID":"7d1b58ad-b491-4354-a0b0-3ab868370dc9","Type":"ContainerDied","Data":"4a71aa26e42592ba8e87f77bcd3cceb2888b1cf39f71b114cbae81109fc61a8b"} Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.874511 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a71aa26e42592ba8e87f77bcd3cceb2888b1cf39f71b114cbae81109fc61a8b" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.893300 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 10 16:45:10 crc kubenswrapper[5036]: I0110 16:45:10.900451 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d24s2"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.140356 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6g85"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.148585 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b3eb-account-create-update-bln2k"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.285397 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1e9d-account-create-update-xf4tj"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.330289 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6758-account-create-update-mbg5f"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.358381 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sz6zr"] Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.407136 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.887542 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d24s2" event={"ID":"644223c9-410d-4ce9-b1a7-d6137d46f3cf","Type":"ContainerDied","Data":"1ec53884f39b464cdcdebf5e1b855b078e887cebdd284c2afa448443a542fa99"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.887572 5036 generic.go:334] "Generic (PLEG): container finished" podID="644223c9-410d-4ce9-b1a7-d6137d46f3cf" containerID="1ec53884f39b464cdcdebf5e1b855b078e887cebdd284c2afa448443a542fa99" exitCode=0 Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.887817 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d24s2" event={"ID":"644223c9-410d-4ce9-b1a7-d6137d46f3cf","Type":"ContainerStarted","Data":"d180800a1ad62efbccc6cf15ab7c6a64c37c554bfd0fa1805d6ee73df28f8195"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.889547 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"70cfbefa-2928-4ca5-aa74-93fb1b4cd059","Type":"ContainerStarted","Data":"99cbc3f5d194d5dad475e340bc74144125af64a38343f0519b8310a1bb76c208"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.893991 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sz6zr" event={"ID":"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c","Type":"ContainerStarted","Data":"aaa9e8a5cd5832c9ece96199964f96f5f7f636a5e4e2aeac5041907f8d4864c1"} Jan 10 16:45:11 crc kubenswrapper[5036]: 
I0110 16:45:11.894270 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sz6zr" event={"ID":"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c","Type":"ContainerStarted","Data":"b69d40d8d948324021dc956ea2984bd47ccfd3c496dc6f2fb8c943416751bd05"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.900375 5036 generic.go:334] "Generic (PLEG): container finished" podID="f52d4419-3cc2-47fb-8a1f-5b086a2660a9" containerID="df635eed0a19abd68f98e6e2dd3b7d65c16d69333e7adc57a6235c75b259d1be" exitCode=0 Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.900491 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6g85" event={"ID":"f52d4419-3cc2-47fb-8a1f-5b086a2660a9","Type":"ContainerDied","Data":"df635eed0a19abd68f98e6e2dd3b7d65c16d69333e7adc57a6235c75b259d1be"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.900538 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6g85" event={"ID":"f52d4419-3cc2-47fb-8a1f-5b086a2660a9","Type":"ContainerStarted","Data":"1d11cb6204c8fd6881c5377b75970d9cae4b6c5f69c7446763a8e140a95f860f"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.911022 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" event={"ID":"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56","Type":"ContainerStarted","Data":"c32f8ba1b80961763f2faa4b29618b0f7d2c21750aca2adb5f383af25c067818"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.911071 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" event={"ID":"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56","Type":"ContainerStarted","Data":"3314fbc0b82d51bb72820bc92154b456e2265ec64b79f7ae7ded960ae0259caa"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.928058 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6758-account-create-update-mbg5f" event={"ID":"d5135e7f-1fec-4960-ab32-eeb7901e1a4d","Type":"ContainerStarted","Data":"c3accb81973c6059402e04d949e82809211eaa8ac93cdf0a430d82051a1859c4"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.928092 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6758-account-create-update-mbg5f" event={"ID":"d5135e7f-1fec-4960-ab32-eeb7901e1a4d","Type":"ContainerStarted","Data":"a04246d0055cac39332b119adff2b1e8b22991b4e8ab5d0be0d73e8b2ab965bd"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.929906 5036 generic.go:334] "Generic (PLEG): container finished" podID="4589fdc9-748c-41e7-ba5b-493750149d60" containerID="602aabefb458eef6fc29fddff81aba1cabbfa80cdd04d75fa7163fbef6f386be" exitCode=0 Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.929935 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" event={"ID":"4589fdc9-748c-41e7-ba5b-493750149d60","Type":"ContainerDied","Data":"602aabefb458eef6fc29fddff81aba1cabbfa80cdd04d75fa7163fbef6f386be"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.929951 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" event={"ID":"4589fdc9-748c-41e7-ba5b-493750149d60","Type":"ContainerStarted","Data":"335c7d4b24aaffdc9ecfda6efeeed93bc0fae034b1a89faeb62c8ec16c4217e5"} Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.934603 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-sz6zr" 
podStartSLOduration=7.934585028 podStartE2EDuration="7.934585028s" podCreationTimestamp="2026-01-10 16:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:11.921182147 +0000 UTC m=+1033.791417651" watchObservedRunningTime="2026-01-10 16:45:11.934585028 +0000 UTC m=+1033.804820522" Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.967900 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.894506281 podStartE2EDuration="14.967873403s" podCreationTimestamp="2026-01-10 16:44:57 +0000 UTC" firstStartedPulling="2026-01-10 16:44:58.319296958 +0000 UTC m=+1020.189532452" lastFinishedPulling="2026-01-10 16:45:10.39266408 +0000 UTC m=+1032.262899574" observedRunningTime="2026-01-10 16:45:11.958536618 +0000 UTC m=+1033.828772112" watchObservedRunningTime="2026-01-10 16:45:11.967873403 +0000 UTC m=+1033.838108897" Jan 10 16:45:11 crc kubenswrapper[5036]: I0110 16:45:11.981075 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" podStartSLOduration=6.981054408 podStartE2EDuration="6.981054408s" podCreationTimestamp="2026-01-10 16:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:11.973976737 +0000 UTC m=+1033.844212231" watchObservedRunningTime="2026-01-10 16:45:11.981054408 +0000 UTC m=+1033.851289902" Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.013941 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6758-account-create-update-mbg5f" podStartSLOduration=7.013922061 podStartE2EDuration="7.013922061s" podCreationTimestamp="2026-01-10 16:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:12.003068283 +0000 UTC m=+1033.873303777" watchObservedRunningTime="2026-01-10 16:45:12.013922061 +0000 UTC m=+1033.884157555" Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.943905 5036 generic.go:334] "Generic (PLEG): container finished" podID="d5135e7f-1fec-4960-ab32-eeb7901e1a4d" containerID="c3accb81973c6059402e04d949e82809211eaa8ac93cdf0a430d82051a1859c4" exitCode=0 Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.944039 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6758-account-create-update-mbg5f" event={"ID":"d5135e7f-1fec-4960-ab32-eeb7901e1a4d","Type":"ContainerDied","Data":"c3accb81973c6059402e04d949e82809211eaa8ac93cdf0a430d82051a1859c4"} Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.946574 5036 generic.go:334] "Generic (PLEG): container finished" podID="e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" containerID="aaa9e8a5cd5832c9ece96199964f96f5f7f636a5e4e2aeac5041907f8d4864c1" exitCode=0 Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.946660 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sz6zr" event={"ID":"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c","Type":"ContainerDied","Data":"aaa9e8a5cd5832c9ece96199964f96f5f7f636a5e4e2aeac5041907f8d4864c1"} Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.949078 5036 generic.go:334] "Generic (PLEG): container finished" podID="e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" 
containerID="c32f8ba1b80961763f2faa4b29618b0f7d2c21750aca2adb5f383af25c067818" exitCode=0 Jan 10 16:45:12 crc kubenswrapper[5036]: I0110 16:45:12.949147 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" event={"ID":"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56","Type":"ContainerDied","Data":"c32f8ba1b80961763f2faa4b29618b0f7d2c21750aca2adb5f383af25c067818"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.060221 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.060515 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-central-agent" containerID="cri-o://9799122ac9cadc514c6fd60701f5da3efe5af896d32d9a346b0cdb47f0541856" gracePeriod=30 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.060778 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="proxy-httpd" containerID="cri-o://4be2ffd36cc0ed2610d2f497f412517d63926a155d7077935831086234c81d70" gracePeriod=30 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.061155 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-notification-agent" containerID="cri-o://4b6a521a87e11a81dcf372ef9d887f370255d1aed3c815e5dc2a5e224e733d2a" gracePeriod=30 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.061217 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="sg-core" containerID="cri-o://9e3efd2ace4f39b8c56d3a06ba8e665c0d02dec1e74513bab6edfc7c5675bec4" gracePeriod=30 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.389513 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.509790 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.514428 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.681759 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts\") pod \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.681976 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqbg9\" (UniqueName: \"kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9\") pod \"4589fdc9-748c-41e7-ba5b-493750149d60\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682018 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ddp\" (UniqueName: \"kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp\") pod \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\" (UID: \"f52d4419-3cc2-47fb-8a1f-5b086a2660a9\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682058 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts\") pod \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682082 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmck9\" (UniqueName: \"kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9\") pod \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\" (UID: \"644223c9-410d-4ce9-b1a7-d6137d46f3cf\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682111 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts\") pod \"4589fdc9-748c-41e7-ba5b-493750149d60\" (UID: \"4589fdc9-748c-41e7-ba5b-493750149d60\") " Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682643 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f52d4419-3cc2-47fb-8a1f-5b086a2660a9" (UID: "f52d4419-3cc2-47fb-8a1f-5b086a2660a9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.682900 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "644223c9-410d-4ce9-b1a7-d6137d46f3cf" (UID: "644223c9-410d-4ce9-b1a7-d6137d46f3cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.683348 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4589fdc9-748c-41e7-ba5b-493750149d60" (UID: "4589fdc9-748c-41e7-ba5b-493750149d60"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.690481 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp" (OuterVolumeSpecName: "kube-api-access-d5ddp") pod "f52d4419-3cc2-47fb-8a1f-5b086a2660a9" (UID: "f52d4419-3cc2-47fb-8a1f-5b086a2660a9"). InnerVolumeSpecName "kube-api-access-d5ddp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.690640 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9" (OuterVolumeSpecName: "kube-api-access-rqbg9") pod "4589fdc9-748c-41e7-ba5b-493750149d60" (UID: "4589fdc9-748c-41e7-ba5b-493750149d60"). InnerVolumeSpecName "kube-api-access-rqbg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.691337 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9" (OuterVolumeSpecName: "kube-api-access-fmck9") pod "644223c9-410d-4ce9-b1a7-d6137d46f3cf" (UID: "644223c9-410d-4ce9-b1a7-d6137d46f3cf"). InnerVolumeSpecName "kube-api-access-fmck9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783423 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ddp\" (UniqueName: \"kubernetes.io/projected/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-kube-api-access-d5ddp\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783456 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/644223c9-410d-4ce9-b1a7-d6137d46f3cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783465 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmck9\" (UniqueName: \"kubernetes.io/projected/644223c9-410d-4ce9-b1a7-d6137d46f3cf-kube-api-access-fmck9\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783474 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4589fdc9-748c-41e7-ba5b-493750149d60-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783482 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f52d4419-3cc2-47fb-8a1f-5b086a2660a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.783489 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqbg9\" (UniqueName: \"kubernetes.io/projected/4589fdc9-748c-41e7-ba5b-493750149d60-kube-api-access-rqbg9\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.957543 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" event={"ID":"4589fdc9-748c-41e7-ba5b-493750149d60","Type":"ContainerDied","Data":"335c7d4b24aaffdc9ecfda6efeeed93bc0fae034b1a89faeb62c8ec16c4217e5"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.957580 5036 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="335c7d4b24aaffdc9ecfda6efeeed93bc0fae034b1a89faeb62c8ec16c4217e5" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.957628 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b3eb-account-create-update-bln2k" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.970338 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d24s2" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.970340 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d24s2" event={"ID":"644223c9-410d-4ce9-b1a7-d6137d46f3cf","Type":"ContainerDied","Data":"d180800a1ad62efbccc6cf15ab7c6a64c37c554bfd0fa1805d6ee73df28f8195"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.970383 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d180800a1ad62efbccc6cf15ab7c6a64c37c554bfd0fa1805d6ee73df28f8195" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977045 5036 generic.go:334] "Generic (PLEG): container finished" podID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerID="4be2ffd36cc0ed2610d2f497f412517d63926a155d7077935831086234c81d70" exitCode=0 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977081 5036 generic.go:334] "Generic (PLEG): container finished" podID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerID="9e3efd2ace4f39b8c56d3a06ba8e665c0d02dec1e74513bab6edfc7c5675bec4" exitCode=2 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977091 5036 generic.go:334] "Generic (PLEG): container finished" podID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerID="9799122ac9cadc514c6fd60701f5da3efe5af896d32d9a346b0cdb47f0541856" exitCode=0 Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977108 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerDied","Data":"4be2ffd36cc0ed2610d2f497f412517d63926a155d7077935831086234c81d70"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977157 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerDied","Data":"9e3efd2ace4f39b8c56d3a06ba8e665c0d02dec1e74513bab6edfc7c5675bec4"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.977170 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerDied","Data":"9799122ac9cadc514c6fd60701f5da3efe5af896d32d9a346b0cdb47f0541856"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.978505 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6g85" event={"ID":"f52d4419-3cc2-47fb-8a1f-5b086a2660a9","Type":"ContainerDied","Data":"1d11cb6204c8fd6881c5377b75970d9cae4b6c5f69c7446763a8e140a95f860f"} Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.978544 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d11cb6204c8fd6881c5377b75970d9cae4b6c5f69c7446763a8e140a95f860f" Jan 10 16:45:13 crc kubenswrapper[5036]: I0110 16:45:13.978559 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6g85" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.375272 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.465376 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.472290 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.495516 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts\") pod \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.495722 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6pt6\" (UniqueName: \"kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6\") pod \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\" (UID: \"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.496670 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" (UID: "e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.512608 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6" (OuterVolumeSpecName: "kube-api-access-v6pt6") pod "e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" (UID: "e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56"). InnerVolumeSpecName "kube-api-access-v6pt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.597758 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74t4d\" (UniqueName: \"kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d\") pod \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.598473 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts\") pod \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.598512 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts\") pod \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\" (UID: \"d5135e7f-1fec-4960-ab32-eeb7901e1a4d\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.598647 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rw6\" (UniqueName: \"kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6\") pod \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\" (UID: \"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c\") " Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.599115 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" (UID: "e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.599270 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6pt6\" (UniqueName: \"kubernetes.io/projected/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-kube-api-access-v6pt6\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.599371 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.599438 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.600456 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5135e7f-1fec-4960-ab32-eeb7901e1a4d" (UID: "d5135e7f-1fec-4960-ab32-eeb7901e1a4d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.601959 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6" (OuterVolumeSpecName: "kube-api-access-69rw6") pod "e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" (UID: "e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c"). InnerVolumeSpecName "kube-api-access-69rw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.603837 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d" (OuterVolumeSpecName: "kube-api-access-74t4d") pod "d5135e7f-1fec-4960-ab32-eeb7901e1a4d" (UID: "d5135e7f-1fec-4960-ab32-eeb7901e1a4d"). InnerVolumeSpecName "kube-api-access-74t4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.702122 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.702170 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rw6\" (UniqueName: \"kubernetes.io/projected/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c-kube-api-access-69rw6\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.702188 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74t4d\" (UniqueName: \"kubernetes.io/projected/d5135e7f-1fec-4960-ab32-eeb7901e1a4d-kube-api-access-74t4d\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.988808 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6758-account-create-update-mbg5f" event={"ID":"d5135e7f-1fec-4960-ab32-eeb7901e1a4d","Type":"ContainerDied","Data":"a04246d0055cac39332b119adff2b1e8b22991b4e8ab5d0be0d73e8b2ab965bd"} Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.988854 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04246d0055cac39332b119adff2b1e8b22991b4e8ab5d0be0d73e8b2ab965bd" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.988814 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6758-account-create-update-mbg5f" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.990214 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sz6zr" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.990222 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sz6zr" event={"ID":"e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c","Type":"ContainerDied","Data":"b69d40d8d948324021dc956ea2984bd47ccfd3c496dc6f2fb8c943416751bd05"} Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.990288 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69d40d8d948324021dc956ea2984bd47ccfd3c496dc6f2fb8c943416751bd05" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.991757 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" event={"ID":"e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56","Type":"ContainerDied","Data":"3314fbc0b82d51bb72820bc92154b456e2265ec64b79f7ae7ded960ae0259caa"} Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.991798 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3314fbc0b82d51bb72820bc92154b456e2265ec64b79f7ae7ded960ae0259caa" Jan 10 16:45:14 crc kubenswrapper[5036]: I0110 16:45:14.991823 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1e9d-account-create-update-xf4tj" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.513551 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pqkrg"] Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514097 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52d4419-3cc2-47fb-8a1f-5b086a2660a9" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514116 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52d4419-3cc2-47fb-8a1f-5b086a2660a9" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514164 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5135e7f-1fec-4960-ab32-eeb7901e1a4d" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514173 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5135e7f-1fec-4960-ab32-eeb7901e1a4d" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514191 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4589fdc9-748c-41e7-ba5b-493750149d60" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514198 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="4589fdc9-748c-41e7-ba5b-493750149d60" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514209 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644223c9-410d-4ce9-b1a7-d6137d46f3cf" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514215 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="644223c9-410d-4ce9-b1a7-d6137d46f3cf" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514254 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d1b58ad-b491-4354-a0b0-3ab868370dc9" containerName="collect-profiles" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514262 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1b58ad-b491-4354-a0b0-3ab868370dc9" 
containerName="collect-profiles" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514273 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514279 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: E0110 16:45:15.514290 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514296 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514583 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514599 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5135e7f-1fec-4960-ab32-eeb7901e1a4d" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514612 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="4589fdc9-748c-41e7-ba5b-493750149d60" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514619 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52d4419-3cc2-47fb-8a1f-5b086a2660a9" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514642 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" containerName="mariadb-account-create-update" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514654 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1b58ad-b491-4354-a0b0-3ab868370dc9" containerName="collect-profiles" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.514662 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="644223c9-410d-4ce9-b1a7-d6137d46f3cf" containerName="mariadb-database-create" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.515284 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.518121 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.518458 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.523776 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j2fq5" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.526467 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pqkrg"] Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.718240 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.718835 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rc27\" (UniqueName: \"kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.719048 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.719116 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.821063 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.821130 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.821212 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: 
\"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.821263 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rc27\" (UniqueName: \"kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.827986 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.834455 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.869182 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:15 crc kubenswrapper[5036]: I0110 16:45:15.871405 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rc27\" (UniqueName: \"kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27\") pod \"nova-cell0-conductor-db-sync-pqkrg\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.036566 5036 generic.go:334] "Generic (PLEG): container finished" podID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerID="4b6a521a87e11a81dcf372ef9d887f370255d1aed3c815e5dc2a5e224e733d2a" exitCode=0 Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.036610 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerDied","Data":"4b6a521a87e11a81dcf372ef9d887f370255d1aed3c815e5dc2a5e224e733d2a"} Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.145754 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.362044 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535232 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535609 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc445\" (UniqueName: \"kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535662 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535873 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535913 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.535983 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.536051 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd\") pod \"160d1d1b-ff02-4e83-8f14-35f21877666a\" (UID: \"160d1d1b-ff02-4e83-8f14-35f21877666a\") " Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.536963 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.538132 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.541698 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445" (OuterVolumeSpecName: "kube-api-access-wc445") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "kube-api-access-wc445". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.549927 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts" (OuterVolumeSpecName: "scripts") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.565198 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.624795 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640092 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc445\" (UniqueName: \"kubernetes.io/projected/160d1d1b-ff02-4e83-8f14-35f21877666a-kube-api-access-wc445\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640133 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640146 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640158 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640169 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640180 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/160d1d1b-ff02-4e83-8f14-35f21877666a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.640714 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-pqkrg"] Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.660672 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data" (OuterVolumeSpecName: "config-data") pod "160d1d1b-ff02-4e83-8f14-35f21877666a" (UID: "160d1d1b-ff02-4e83-8f14-35f21877666a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:16 crc kubenswrapper[5036]: I0110 16:45:16.741960 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/160d1d1b-ff02-4e83-8f14-35f21877666a-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.048890 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"160d1d1b-ff02-4e83-8f14-35f21877666a","Type":"ContainerDied","Data":"56ce3b2a2af81f44226d3f4737cb8c809135adadec95a794922aa3fa50874f33"} Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.048943 5036 scope.go:117] "RemoveContainer" containerID="4be2ffd36cc0ed2610d2f497f412517d63926a155d7077935831086234c81d70" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.049243 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.050306 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" event={"ID":"3ff51de8-7e61-4799-b5fc-24e294ec8050","Type":"ContainerStarted","Data":"b8ef109761eeee9dca5c8717d655d989956391a61a36b36305aa142fa5af882b"} Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.091185 5036 scope.go:117] "RemoveContainer" containerID="9e3efd2ace4f39b8c56d3a06ba8e665c0d02dec1e74513bab6edfc7c5675bec4" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.105313 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.114398 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.121491 5036 scope.go:117] "RemoveContainer" containerID="4b6a521a87e11a81dcf372ef9d887f370255d1aed3c815e5dc2a5e224e733d2a" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.143837 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: E0110 16:45:17.144309 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="sg-core" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144330 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="sg-core" Jan 10 16:45:17 crc kubenswrapper[5036]: E0110 16:45:17.144347 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="proxy-httpd" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144354 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="proxy-httpd" Jan 10 16:45:17 crc kubenswrapper[5036]: E0110 16:45:17.144368 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-notification-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 
16:45:17.144377 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-notification-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: E0110 16:45:17.144408 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-central-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144415 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-central-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144566 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-central-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144577 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="proxy-httpd" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144586 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="sg-core" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.144596 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" containerName="ceilometer-notification-agent" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.146112 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.149124 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.149286 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.151820 5036 scope.go:117] "RemoveContainer" containerID="9799122ac9cadc514c6fd60701f5da3efe5af896d32d9a346b0cdb47f0541856" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.152860 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.255566 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256042 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vnk\" (UniqueName: \"kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256186 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256296 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256360 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256505 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.256567 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358211 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358278 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vnk\" (UniqueName: \"kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358328 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358360 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358394 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358435 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.358472 5036 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.359097 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.359809 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.363190 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.363223 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.364942 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.367668 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.381334 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vnk\" (UniqueName: \"kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk\") pod \"ceilometer-0\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.456876 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.457485 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerName="kube-state-metrics" containerID="cri-o://a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36" gracePeriod=30 Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.468235 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.936499 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:45:17 crc kubenswrapper[5036]: I0110 16:45:17.943978 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:17 crc kubenswrapper[5036]: W0110 16:45:17.979159 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc69521dc_96b5_46cb_b203_2a798dc9113e.slice/crio-35e6ea63c9f07fb4ca66c6c9a0e365c6991757ef45e454449caa44f5c31c37c5 WatchSource:0}: Error finding container 35e6ea63c9f07fb4ca66c6c9a0e365c6991757ef45e454449caa44f5c31c37c5: Status 404 returned error can't find the container with id 35e6ea63c9f07fb4ca66c6c9a0e365c6991757ef45e454449caa44f5c31c37c5 Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.061534 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerStarted","Data":"35e6ea63c9f07fb4ca66c6c9a0e365c6991757ef45e454449caa44f5c31c37c5"} Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.063271 5036 generic.go:334] "Generic (PLEG): container finished" podID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerID="a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36" exitCode=2 Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.063328 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66dcc1cf-f7f9-4064-b019-4ec5f205ea03","Type":"ContainerDied","Data":"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36"} Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.063331 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.063346 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"66dcc1cf-f7f9-4064-b019-4ec5f205ea03","Type":"ContainerDied","Data":"427a1549c177d1cb90bb460a678a5aa958f202bae0caeebb3303cc6fac996785"} Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.063362 5036 scope.go:117] "RemoveContainer" containerID="a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.073177 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xb8b\" (UniqueName: \"kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b\") pod \"66dcc1cf-f7f9-4064-b019-4ec5f205ea03\" (UID: \"66dcc1cf-f7f9-4064-b019-4ec5f205ea03\") " Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.079323 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b" (OuterVolumeSpecName: "kube-api-access-5xb8b") pod "66dcc1cf-f7f9-4064-b019-4ec5f205ea03" (UID: "66dcc1cf-f7f9-4064-b019-4ec5f205ea03"). InnerVolumeSpecName "kube-api-access-5xb8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.089337 5036 scope.go:117] "RemoveContainer" containerID="a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36" Jan 10 16:45:18 crc kubenswrapper[5036]: E0110 16:45:18.089907 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36\": container with ID starting with a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36 not found: ID does not exist" containerID="a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.089946 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36"} err="failed to get container status \"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36\": rpc error: code = NotFound desc = could not find container \"a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36\": container with ID starting with a7e1ab85b95d8b789d27309fd1ee4185b214a81b64b7aba6362608d65d419e36 not found: ID does not exist" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.175005 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xb8b\" (UniqueName: \"kubernetes.io/projected/66dcc1cf-f7f9-4064-b019-4ec5f205ea03-kube-api-access-5xb8b\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.420654 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.433940 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.446315 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:18 crc kubenswrapper[5036]: E0110 16:45:18.446764 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerName="kube-state-metrics" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.446786 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerName="kube-state-metrics" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.446993 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerName="kube-state-metrics" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.447768 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.451458 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.451838 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.459671 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.544059 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160d1d1b-ff02-4e83-8f14-35f21877666a" path="/var/lib/kubelet/pods/160d1d1b-ff02-4e83-8f14-35f21877666a/volumes" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.544983 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" path="/var/lib/kubelet/pods/66dcc1cf-f7f9-4064-b019-4ec5f205ea03/volumes" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.554762 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.583293 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.583336 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.583390 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9wk\" (UniqueName: \"kubernetes.io/projected/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-api-access-rm9wk\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.583441 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.685528 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.686008 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.686079 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9wk\" (UniqueName: \"kubernetes.io/projected/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-api-access-rm9wk\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.686123 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.692752 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.695568 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.706085 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.710064 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9wk\" (UniqueName: \"kubernetes.io/projected/2c6502b1-879a-46ee-a2ff-54cece3ee9e6-kube-api-access-rm9wk\") pod \"kube-state-metrics-0\" (UID: \"2c6502b1-879a-46ee-a2ff-54cece3ee9e6\") " pod="openstack/kube-state-metrics-0" Jan 10 16:45:18 crc kubenswrapper[5036]: I0110 16:45:18.772103 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 10 16:45:19 crc kubenswrapper[5036]: I0110 16:45:19.075440 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerStarted","Data":"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e"} Jan 10 16:45:19 crc kubenswrapper[5036]: I0110 16:45:19.251321 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 10 16:45:19 crc kubenswrapper[5036]: W0110 16:45:19.254769 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c6502b1_879a_46ee_a2ff_54cece3ee9e6.slice/crio-4b60ad55952b9e2d0c0c499d3b4f3032408e3d7e9d5670bc99919982aae6ff1c WatchSource:0}: Error finding container 4b60ad55952b9e2d0c0c499d3b4f3032408e3d7e9d5670bc99919982aae6ff1c: Status 404 returned error can't find the container with id 4b60ad55952b9e2d0c0c499d3b4f3032408e3d7e9d5670bc99919982aae6ff1c Jan 10 16:45:20 crc kubenswrapper[5036]: I0110 16:45:20.086390 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerStarted","Data":"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504"} Jan 10 16:45:20 crc kubenswrapper[5036]: I0110 16:45:20.088861 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2c6502b1-879a-46ee-a2ff-54cece3ee9e6","Type":"ContainerStarted","Data":"0c77ce13ca674264279ab851a5f5a054b08bb149a5531f6df20c13ec8b344d9d"} Jan 10 16:45:20 crc kubenswrapper[5036]: I0110 16:45:20.088889 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2c6502b1-879a-46ee-a2ff-54cece3ee9e6","Type":"ContainerStarted","Data":"4b60ad55952b9e2d0c0c499d3b4f3032408e3d7e9d5670bc99919982aae6ff1c"} Jan 10 16:45:20 crc kubenswrapper[5036]: I0110 16:45:20.089083 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 10 16:45:20 crc kubenswrapper[5036]: I0110 16:45:20.109197 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.75982314 podStartE2EDuration="2.109176266s" podCreationTimestamp="2026-01-10 16:45:18 +0000 UTC" firstStartedPulling="2026-01-10 16:45:19.268903931 +0000 UTC m=+1041.139139425" lastFinishedPulling="2026-01-10 16:45:19.618257057 +0000 UTC m=+1041.488492551" observedRunningTime="2026-01-10 16:45:20.105563513 +0000 UTC m=+1041.975799007" watchObservedRunningTime="2026-01-10 16:45:20.109176266 +0000 UTC m=+1041.979411760" Jan 10 16:45:22 crc kubenswrapper[5036]: I0110 16:45:22.869560 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="66dcc1cf-f7f9-4064-b019-4ec5f205ea03" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 10 16:45:25 crc kubenswrapper[5036]: I0110 16:45:25.904354 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:45:25 crc kubenswrapper[5036]: I0110 
16:45:25.904915 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:45:25 crc kubenswrapper[5036]: I0110 16:45:25.904963 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:45:25 crc kubenswrapper[5036]: I0110 16:45:25.905583 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:45:25 crc kubenswrapper[5036]: I0110 16:45:25.905636 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136" gracePeriod=600 Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.158528 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136" exitCode=0 Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.158572 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136"} Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.158922 5036 scope.go:117] "RemoveContainer" containerID="5ab5f37cb035aad8d11f5d80baed8e115b668e21b971e58b556adfab87217a78" Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.163072 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" event={"ID":"3ff51de8-7e61-4799-b5fc-24e294ec8050","Type":"ContainerStarted","Data":"5083d291349a6cf55709db754c43d52d0f05538b981c428264263991423167a1"} Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.167325 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerStarted","Data":"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e"} Jan 10 16:45:26 crc kubenswrapper[5036]: I0110 16:45:26.182712 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" podStartSLOduration=2.106632874 podStartE2EDuration="11.182695149s" podCreationTimestamp="2026-01-10 16:45:15 +0000 UTC" firstStartedPulling="2026-01-10 16:45:16.647469072 +0000 UTC m=+1038.517704556" lastFinishedPulling="2026-01-10 16:45:25.723531337 +0000 UTC m=+1047.593766831" observedRunningTime="2026-01-10 16:45:26.181021411 +0000 UTC m=+1048.051256905" watchObservedRunningTime="2026-01-10 16:45:26.182695149 +0000 UTC m=+1048.052930653" Jan 10 16:45:27 crc kubenswrapper[5036]: I0110 16:45:27.176616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be"} Jan 10 16:45:28 crc kubenswrapper[5036]: I0110 16:45:28.783399 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195318 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerStarted","Data":"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2"} Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195464 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-central-agent" containerID="cri-o://677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e" gracePeriod=30 Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195527 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195865 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="proxy-httpd" containerID="cri-o://313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2" gracePeriod=30 Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195912 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="sg-core" containerID="cri-o://b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e" gracePeriod=30 Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.195950 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-notification-agent" containerID="cri-o://39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504" gracePeriod=30 Jan 10 16:45:29 crc kubenswrapper[5036]: I0110 16:45:29.228373 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.071459812 podStartE2EDuration="12.228352759s" podCreationTimestamp="2026-01-10 16:45:17 +0000 UTC" firstStartedPulling="2026-01-10 16:45:17.983142725 +0000 UTC m=+1039.853378209" lastFinishedPulling="2026-01-10 16:45:28.140035652 +0000 UTC m=+1050.010271156" observedRunningTime="2026-01-10 16:45:29.21998154 +0000 UTC m=+1051.090217044" watchObservedRunningTime="2026-01-10 16:45:29.228352759 +0000 UTC m=+1051.098588253" Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.208152 5036 generic.go:334] "Generic (PLEG): container finished" podID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerID="313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2" exitCode=0 Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.209626 5036 generic.go:334] "Generic (PLEG): container finished" podID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerID="b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e" exitCode=2 Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.209759 5036 generic.go:334] "Generic (PLEG): container finished" podID="c69521dc-96b5-46cb-b203-2a798dc9113e" 
containerID="677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e" exitCode=0 Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.208181 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerDied","Data":"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2"} Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.209950 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerDied","Data":"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e"} Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.210044 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerDied","Data":"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e"} Jan 10 16:45:30 crc kubenswrapper[5036]: I0110 16:45:30.892792 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048280 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048374 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048402 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048519 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048601 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8vnk\" (UniqueName: \"kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048652 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml\") pod \"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048743 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts\") pod 
\"c69521dc-96b5-46cb-b203-2a798dc9113e\" (UID: \"c69521dc-96b5-46cb-b203-2a798dc9113e\") " Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.048943 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.049521 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.054793 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk" (OuterVolumeSpecName: "kube-api-access-p8vnk") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "kube-api-access-p8vnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.063054 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts" (OuterVolumeSpecName: "scripts") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.082636 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155014 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8vnk\" (UniqueName: \"kubernetes.io/projected/c69521dc-96b5-46cb-b203-2a798dc9113e-kube-api-access-p8vnk\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155055 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155068 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155081 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155091 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c69521dc-96b5-46cb-b203-2a798dc9113e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.155819 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data" (OuterVolumeSpecName: "config-data") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.188248 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c69521dc-96b5-46cb-b203-2a798dc9113e" (UID: "c69521dc-96b5-46cb-b203-2a798dc9113e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.219776 5036 generic.go:334] "Generic (PLEG): container finished" podID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerID="39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504" exitCode=0 Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.219816 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerDied","Data":"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504"} Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.219841 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c69521dc-96b5-46cb-b203-2a798dc9113e","Type":"ContainerDied","Data":"35e6ea63c9f07fb4ca66c6c9a0e365c6991757ef45e454449caa44f5c31c37c5"} Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.219857 5036 scope.go:117] "RemoveContainer" containerID="313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.219991 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.254670 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.256912 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.256935 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c69521dc-96b5-46cb-b203-2a798dc9113e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.264349 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.265163 5036 scope.go:117] "RemoveContainer" containerID="b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312038 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.312508 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="proxy-httpd" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312533 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="proxy-httpd" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.312566 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-notification-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312575 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-notification-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.312586 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-central-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312594 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-central-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.312624 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="sg-core" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312631 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="sg-core" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312845 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-notification-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312863 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="sg-core" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312875 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="ceilometer-central-agent" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.312894 5036 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" containerName="proxy-httpd" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.322415 5036 scope.go:117] "RemoveContainer" containerID="39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.354409 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.359147 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.359265 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.359542 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.394757 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.395397 5036 scope.go:117] "RemoveContainer" containerID="677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461072 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461139 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461167 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461213 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjgh5\" (UniqueName: \"kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461280 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461310 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc 
kubenswrapper[5036]: I0110 16:45:31.461384 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.461409 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.466296 5036 scope.go:117] "RemoveContainer" containerID="313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.467076 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2\": container with ID starting with 313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2 not found: ID does not exist" containerID="313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.467122 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2"} err="failed to get container status \"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2\": rpc error: code = NotFound desc = could not find container \"313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2\": container with ID starting with 313ab2a141323e3c407b4fac95caafac5ba599b4745b7b1ae37b5780c72462e2 not found: ID does not exist" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.467151 5036 scope.go:117] "RemoveContainer" containerID="b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.467528 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e\": container with ID starting with b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e not found: ID does not exist" containerID="b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.467556 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e"} err="failed to get container status \"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e\": rpc error: code = NotFound desc = could not find container \"b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e\": container with ID starting with b5bb66042ea7e8c433b64716b5beb169e194528b20dd1cac82c971d54b1eb80e not found: ID does not exist" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.467572 5036 scope.go:117] "RemoveContainer" containerID="39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.467990 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504\": container with ID starting with 39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504 not found: ID does not exist" containerID="39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.468009 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504"} err="failed to get container status \"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504\": rpc error: code = NotFound desc = could not find container \"39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504\": container with ID starting with 39503cc2344c82f88559e4a715f3e1417099004d8f9a600a3fe2d99494d80504 not found: ID does not exist" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.468025 5036 scope.go:117] "RemoveContainer" containerID="677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e" Jan 10 16:45:31 crc kubenswrapper[5036]: E0110 16:45:31.468310 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e\": container with ID starting with 677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e not found: ID does not exist" containerID="677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.468333 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e"} err="failed to get container status \"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e\": rpc error: code = NotFound desc = could not find container \"677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e\": container with ID starting with 677283f00f05537ef651ded70a1acc0d9d57158284cc9b6e66131727c00e939e not found: ID does not exist" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562765 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562816 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562842 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562883 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjgh5\" (UniqueName: \"kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 
16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562943 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.562970 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.563061 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.563080 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.563815 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.563949 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.567301 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.567742 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.567824 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.568704 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.572844 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.581907 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjgh5\" (UniqueName: \"kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5\") pod \"ceilometer-0\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " pod="openstack/ceilometer-0" Jan 10 16:45:31 crc kubenswrapper[5036]: I0110 16:45:31.698611 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:32 crc kubenswrapper[5036]: I0110 16:45:32.147912 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:32 crc kubenswrapper[5036]: W0110 16:45:32.150861 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5381a5fe_3732_4b03_8bed_b644a2070536.slice/crio-f53b5aa3d4b777c087c98bb9613834673f4da9552fe219272db37ba9fd86ac01 WatchSource:0}: Error finding container f53b5aa3d4b777c087c98bb9613834673f4da9552fe219272db37ba9fd86ac01: Status 404 returned error can't find the container with id f53b5aa3d4b777c087c98bb9613834673f4da9552fe219272db37ba9fd86ac01 Jan 10 16:45:32 crc kubenswrapper[5036]: I0110 16:45:32.230606 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerStarted","Data":"f53b5aa3d4b777c087c98bb9613834673f4da9552fe219272db37ba9fd86ac01"} Jan 10 16:45:32 crc kubenswrapper[5036]: I0110 16:45:32.519574 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69521dc-96b5-46cb-b203-2a798dc9113e" path="/var/lib/kubelet/pods/c69521dc-96b5-46cb-b203-2a798dc9113e/volumes" Jan 10 16:45:32 crc kubenswrapper[5036]: I0110 16:45:32.813076 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:33 crc kubenswrapper[5036]: I0110 16:45:33.241078 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerStarted","Data":"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547"} Jan 10 16:45:34 crc kubenswrapper[5036]: I0110 16:45:34.269917 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerStarted","Data":"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e"} Jan 10 16:45:35 crc kubenswrapper[5036]: I0110 16:45:35.281931 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerStarted","Data":"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863"} Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370120 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerStarted","Data":"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e"} Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370482 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-central-agent" containerID="cri-o://969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547" gracePeriod=30 Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370814 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="proxy-httpd" containerID="cri-o://70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e" gracePeriod=30 Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370827 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370860 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="sg-core" containerID="cri-o://a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863" gracePeriod=30 Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.370893 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-notification-agent" containerID="cri-o://ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e" gracePeriod=30 Jan 10 16:45:36 crc kubenswrapper[5036]: I0110 16:45:36.440778 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.841260057 podStartE2EDuration="5.440760804s" podCreationTimestamp="2026-01-10 16:45:31 +0000 UTC" firstStartedPulling="2026-01-10 16:45:32.153630672 +0000 UTC m=+1054.023866166" lastFinishedPulling="2026-01-10 16:45:35.753131419 +0000 UTC m=+1057.623366913" observedRunningTime="2026-01-10 16:45:36.418924401 +0000 UTC m=+1058.289159895" watchObservedRunningTime="2026-01-10 16:45:36.440760804 +0000 UTC m=+1058.310996298" Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381103 5036 generic.go:334] "Generic (PLEG): container finished" podID="5381a5fe-3732-4b03-8bed-b644a2070536" containerID="70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e" exitCode=0 Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381172 5036 generic.go:334] "Generic (PLEG): container finished" podID="5381a5fe-3732-4b03-8bed-b644a2070536" containerID="a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863" exitCode=2 Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381167 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerDied","Data":"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e"} Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381232 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerDied","Data":"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863"} Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381252 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerDied","Data":"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e"} Jan 10 16:45:37 crc kubenswrapper[5036]: I0110 16:45:37.381193 5036 generic.go:334] "Generic (PLEG): container finished" 
podID="5381a5fe-3732-4b03-8bed-b644a2070536" containerID="ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e" exitCode=0 Jan 10 16:45:38 crc kubenswrapper[5036]: I0110 16:45:38.393319 5036 generic.go:334] "Generic (PLEG): container finished" podID="3ff51de8-7e61-4799-b5fc-24e294ec8050" containerID="5083d291349a6cf55709db754c43d52d0f05538b981c428264263991423167a1" exitCode=0 Jan 10 16:45:38 crc kubenswrapper[5036]: I0110 16:45:38.393394 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" event={"ID":"3ff51de8-7e61-4799-b5fc-24e294ec8050","Type":"ContainerDied","Data":"5083d291349a6cf55709db754c43d52d0f05538b981c428264263991423167a1"} Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.726248 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.869654 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts\") pod \"3ff51de8-7e61-4799-b5fc-24e294ec8050\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.869838 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rc27\" (UniqueName: \"kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27\") pod \"3ff51de8-7e61-4799-b5fc-24e294ec8050\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.869926 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") pod \"3ff51de8-7e61-4799-b5fc-24e294ec8050\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.869990 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data\") pod \"3ff51de8-7e61-4799-b5fc-24e294ec8050\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.877726 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27" (OuterVolumeSpecName: "kube-api-access-5rc27") pod "3ff51de8-7e61-4799-b5fc-24e294ec8050" (UID: "3ff51de8-7e61-4799-b5fc-24e294ec8050"). InnerVolumeSpecName "kube-api-access-5rc27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.878417 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts" (OuterVolumeSpecName: "scripts") pod "3ff51de8-7e61-4799-b5fc-24e294ec8050" (UID: "3ff51de8-7e61-4799-b5fc-24e294ec8050"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:39 crc kubenswrapper[5036]: E0110 16:45:39.896068 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle podName:3ff51de8-7e61-4799-b5fc-24e294ec8050 nodeName:}" failed. 
No retries permitted until 2026-01-10 16:45:40.396033662 +0000 UTC m=+1062.266269156 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle") pod "3ff51de8-7e61-4799-b5fc-24e294ec8050" (UID: "3ff51de8-7e61-4799-b5fc-24e294ec8050") : error deleting /var/lib/kubelet/pods/3ff51de8-7e61-4799-b5fc-24e294ec8050/volume-subpaths: remove /var/lib/kubelet/pods/3ff51de8-7e61-4799-b5fc-24e294ec8050/volume-subpaths: no such file or directory Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.900753 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data" (OuterVolumeSpecName: "config-data") pod "3ff51de8-7e61-4799-b5fc-24e294ec8050" (UID: "3ff51de8-7e61-4799-b5fc-24e294ec8050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.972496 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.972537 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:39 crc kubenswrapper[5036]: I0110 16:45:39.972546 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rc27\" (UniqueName: \"kubernetes.io/projected/3ff51de8-7e61-4799-b5fc-24e294ec8050-kube-api-access-5rc27\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.408801 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" event={"ID":"3ff51de8-7e61-4799-b5fc-24e294ec8050","Type":"ContainerDied","Data":"b8ef109761eeee9dca5c8717d655d989956391a61a36b36305aa142fa5af882b"} Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.409212 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8ef109761eeee9dca5c8717d655d989956391a61a36b36305aa142fa5af882b" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.408871 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pqkrg" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.422436 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") pod \"3ff51de8-7e61-4799-b5fc-24e294ec8050\" (UID: \"3ff51de8-7e61-4799-b5fc-24e294ec8050\") " Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.434895 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff51de8-7e61-4799-b5fc-24e294ec8050" (UID: "3ff51de8-7e61-4799-b5fc-24e294ec8050"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.525728 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff51de8-7e61-4799-b5fc-24e294ec8050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.541589 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 10 16:45:40 crc kubenswrapper[5036]: E0110 16:45:40.542410 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff51de8-7e61-4799-b5fc-24e294ec8050" containerName="nova-cell0-conductor-db-sync" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.546539 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff51de8-7e61-4799-b5fc-24e294ec8050" containerName="nova-cell0-conductor-db-sync" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.547062 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff51de8-7e61-4799-b5fc-24e294ec8050" containerName="nova-cell0-conductor-db-sync" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.547865 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.563848 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.729605 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.730622 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gpk\" (UniqueName: \"kubernetes.io/projected/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-kube-api-access-l2gpk\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.730877 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.833362 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.833456 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.833519 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l2gpk\" (UniqueName: \"kubernetes.io/projected/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-kube-api-access-l2gpk\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.839352 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.857003 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gpk\" (UniqueName: \"kubernetes.io/projected/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-kube-api-access-l2gpk\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.865346 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98e9d9c-f90a-44da-9b67-2dadaf5b24b3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3\") " pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:40 crc kubenswrapper[5036]: I0110 16:45:40.882425 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.077645 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.196941 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 10 16:45:41 crc kubenswrapper[5036]: W0110 16:45:41.199093 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode98e9d9c_f90a_44da_9b67_2dadaf5b24b3.slice/crio-f0230464a0e060987bf4085bd9cfe6ae2712936090c039c84e1206691187e95b WatchSource:0}: Error finding container f0230464a0e060987bf4085bd9cfe6ae2712936090c039c84e1206691187e95b: Status 404 returned error can't find the container with id f0230464a0e060987bf4085bd9cfe6ae2712936090c039c84e1206691187e95b Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.249254 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjgh5\" (UniqueName: \"kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.249571 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.249736 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.249853 5036 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.250010 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.250079 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.250186 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.250257 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts\") pod \"5381a5fe-3732-4b03-8bed-b644a2070536\" (UID: \"5381a5fe-3732-4b03-8bed-b644a2070536\") " Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.251173 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.251430 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.254339 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5" (OuterVolumeSpecName: "kube-api-access-wjgh5") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "kube-api-access-wjgh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.255201 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts" (OuterVolumeSpecName: "scripts") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.284103 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.311085 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.349487 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.352591 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.352756 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.352835 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.352908 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5381a5fe-3732-4b03-8bed-b644a2070536-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.352969 5036 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.353028 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.353085 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjgh5\" (UniqueName: \"kubernetes.io/projected/5381a5fe-3732-4b03-8bed-b644a2070536-kube-api-access-wjgh5\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.361082 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data" (OuterVolumeSpecName: "config-data") pod 
"5381a5fe-3732-4b03-8bed-b644a2070536" (UID: "5381a5fe-3732-4b03-8bed-b644a2070536"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.418233 5036 generic.go:334] "Generic (PLEG): container finished" podID="5381a5fe-3732-4b03-8bed-b644a2070536" containerID="969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547" exitCode=0 Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.418298 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerDied","Data":"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547"} Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.418312 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.418338 5036 scope.go:117] "RemoveContainer" containerID="70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.418325 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5381a5fe-3732-4b03-8bed-b644a2070536","Type":"ContainerDied","Data":"f53b5aa3d4b777c087c98bb9613834673f4da9552fe219272db37ba9fd86ac01"} Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.421511 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3","Type":"ContainerStarted","Data":"228e114f089ef18a748ea3b5d42214542a3471d2f34a10d04b1ab4447228f451"} Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.421589 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e98e9d9c-f90a-44da-9b67-2dadaf5b24b3","Type":"ContainerStarted","Data":"f0230464a0e060987bf4085bd9cfe6ae2712936090c039c84e1206691187e95b"} Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.438160 5036 scope.go:117] "RemoveContainer" containerID="a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.449836 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.454657 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5381a5fe-3732-4b03-8bed-b644a2070536-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.457469 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.467734 5036 scope.go:117] "RemoveContainer" containerID="ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.474620 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.475075 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-notification-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475102 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-notification-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.475135 5036 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="proxy-httpd" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475144 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="proxy-httpd" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.475160 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="sg-core" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475171 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="sg-core" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.475193 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-central-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475201 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-central-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475396 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="sg-core" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475418 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="proxy-httpd" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475430 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-central-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.475442 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" containerName="ceilometer-notification-agent" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.477397 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.479375 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.479580 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.480025 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.485079 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.490190 5036 scope.go:117] "RemoveContainer" containerID="969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.508176 5036 scope.go:117] "RemoveContainer" containerID="70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.508737 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e\": container with ID starting with 70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e not found: ID does not exist" containerID="70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.508782 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e"} err="failed to get container status \"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e\": rpc error: code = NotFound desc = could not find container \"70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e\": container with ID starting with 70c48ef720eecca108c1db3f01cd106e3b12f092f572cb078c0947b7d750743e not found: ID does not exist" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.508816 5036 scope.go:117] "RemoveContainer" containerID="a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.509942 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863\": container with ID starting with a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863 not found: ID does not exist" containerID="a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.510085 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863"} err="failed to get container status \"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863\": rpc error: code = NotFound desc = could not find container \"a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863\": container with ID starting with a2dff467e93cff5a309840ea4ff3d585bc1ca22135c13255036ee3acc77ee863 not found: ID does not exist" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.510173 5036 scope.go:117] "RemoveContainer" containerID="ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e" Jan 10 16:45:41 
crc kubenswrapper[5036]: E0110 16:45:41.510985 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e\": container with ID starting with ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e not found: ID does not exist" containerID="ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.511011 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e"} err="failed to get container status \"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e\": rpc error: code = NotFound desc = could not find container \"ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e\": container with ID starting with ea7758ab3d0fd620975cdd9808365972c81ed22c5f88ee25d253c928a7a27b0e not found: ID does not exist" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.511025 5036 scope.go:117] "RemoveContainer" containerID="969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547" Jan 10 16:45:41 crc kubenswrapper[5036]: E0110 16:45:41.512103 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547\": container with ID starting with 969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547 not found: ID does not exist" containerID="969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.512189 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547"} err="failed to get container status \"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547\": rpc error: code = NotFound desc = could not find container \"969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547\": container with ID starting with 969308f26ac349ae4d016287034725f0a62b7dbe89e355714b4770803890f547 not found: ID does not exist" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.658323 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.658889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659010 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659101 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659201 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqd4\" (UniqueName: \"kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659353 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659483 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.659592 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.761664 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763242 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763358 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763490 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763590 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqd4\" (UniqueName: \"kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763732 
5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763863 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.763984 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.764445 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.764594 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.767235 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.767532 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.767738 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.768413 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.768926 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.784165 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jzqd4\" (UniqueName: \"kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4\") pod \"ceilometer-0\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " pod="openstack/ceilometer-0" Jan 10 16:45:41 crc kubenswrapper[5036]: I0110 16:45:41.798078 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:45:42 crc kubenswrapper[5036]: I0110 16:45:42.310863 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:45:42 crc kubenswrapper[5036]: W0110 16:45:42.325786 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25a6fbd_9d05_4ccd_bcf0_0e2569e5210c.slice/crio-89e73c32069fbf5ce31ac5f27c17d9215b6db3e38a0c4c4eb2d821269249d566 WatchSource:0}: Error finding container 89e73c32069fbf5ce31ac5f27c17d9215b6db3e38a0c4c4eb2d821269249d566: Status 404 returned error can't find the container with id 89e73c32069fbf5ce31ac5f27c17d9215b6db3e38a0c4c4eb2d821269249d566 Jan 10 16:45:42 crc kubenswrapper[5036]: I0110 16:45:42.432539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerStarted","Data":"89e73c32069fbf5ce31ac5f27c17d9215b6db3e38a0c4c4eb2d821269249d566"} Jan 10 16:45:42 crc kubenswrapper[5036]: I0110 16:45:42.432724 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:42 crc kubenswrapper[5036]: I0110 16:45:42.453364 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.453348027 podStartE2EDuration="2.453348027s" podCreationTimestamp="2026-01-10 16:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:42.449695053 +0000 UTC m=+1064.319930547" watchObservedRunningTime="2026-01-10 16:45:42.453348027 +0000 UTC m=+1064.323583521" Jan 10 16:45:42 crc kubenswrapper[5036]: I0110 16:45:42.516591 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5381a5fe-3732-4b03-8bed-b644a2070536" path="/var/lib/kubelet/pods/5381a5fe-3732-4b03-8bed-b644a2070536/volumes" Jan 10 16:45:44 crc kubenswrapper[5036]: I0110 16:45:44.450041 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerStarted","Data":"58d0d3001d503c3b44d42727773790d14733ce6cc7cc20b1773f78ec18742634"} Jan 10 16:45:45 crc kubenswrapper[5036]: I0110 16:45:45.464552 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerStarted","Data":"63b13ef439ca6c62bf2244c701b81a82dfaf0673f9e6fb779cab6e45955aefc5"} Jan 10 16:45:46 crc kubenswrapper[5036]: I0110 16:45:46.472447 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerStarted","Data":"ab6be45333fa9739aaae8dace5f8632f42666c9399daa665f72448f2ddde31de"} Jan 10 16:45:48 crc kubenswrapper[5036]: I0110 16:45:48.496483 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerStarted","Data":"5a0380421d3f6630d8ecd24205055565cedff466560216642627b3358082472e"} Jan 10 16:45:48 crc kubenswrapper[5036]: I0110 16:45:48.497031 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:45:48 crc kubenswrapper[5036]: I0110 16:45:48.542764 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.367174983 podStartE2EDuration="7.542747095s" podCreationTimestamp="2026-01-10 16:45:41 +0000 UTC" firstStartedPulling="2026-01-10 16:45:42.328381129 +0000 UTC m=+1064.198616623" lastFinishedPulling="2026-01-10 16:45:47.503953221 +0000 UTC m=+1069.374188735" observedRunningTime="2026-01-10 16:45:48.537181466 +0000 UTC m=+1070.407416990" watchObservedRunningTime="2026-01-10 16:45:48.542747095 +0000 UTC m=+1070.412982589" Jan 10 16:45:50 crc kubenswrapper[5036]: I0110 16:45:50.920474 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.402958 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lzdzf"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.404921 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.407942 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.408307 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.413530 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzdzf"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.553492 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc4vm\" (UniqueName: \"kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.553564 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.553594 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.553805 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " 
pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.612696 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.614257 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.621433 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.622786 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.635196 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.638311 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.638945 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.645907 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.654932 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.655076 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc4vm\" (UniqueName: \"kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.655123 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.655158 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.665603 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.668354 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: 
\"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.668814 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.719239 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc4vm\" (UniqueName: \"kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm\") pod \"nova-cell0-cell-mapping-lzdzf\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.725913 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.758275 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761486 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761639 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdmj\" (UniqueName: \"kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761715 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761750 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761773 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761838 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761853 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zctqr\" (UniqueName: \"kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.761886 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.773965 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.805142 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.810287 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.811747 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.846465 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.867967 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868020 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868080 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826kc\" (UniqueName: \"kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868114 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdmj\" (UniqueName: \"kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868148 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868175 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc 
kubenswrapper[5036]: I0110 16:45:51.868194 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868215 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868231 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zctqr\" (UniqueName: \"kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868251 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.868269 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.873530 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.877309 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.879350 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.890301 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.893382 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.902483 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.924151 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdmj\" (UniqueName: \"kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj\") pod \"nova-api-0\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.932863 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.940760 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.941526 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zctqr\" (UniqueName: \"kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr\") pod \"nova-cell1-novncproxy-0\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.942761 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.944476 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992691 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992721 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992727 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992781 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992842 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-826kc\" (UniqueName: \"kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992873 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data\") pod \"nova-scheduler-0\" (UID: 
\"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992896 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7z5\" (UniqueName: \"kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.992919 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:51 crc kubenswrapper[5036]: I0110 16:45:51.993058 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.012952 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.014118 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-826kc\" (UniqueName: \"kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.016442 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " pod="openstack/nova-metadata-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094176 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094564 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwffv\" (UniqueName: \"kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094669 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 
16:45:52.094720 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094759 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7z5\" (UniqueName: \"kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094791 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.094932 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.099714 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.104522 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.118023 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7z5\" (UniqueName: \"kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5\") pod \"nova-scheduler-0\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.147932 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.162051 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.206341 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.206410 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.206469 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.206497 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwffv\" (UniqueName: \"kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.206562 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.207583 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.208199 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.208895 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.209445 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.239656 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwffv\" (UniqueName: \"kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv\") pod \"dnsmasq-dns-566b5b7845-q9t6n\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.311909 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.340370 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzdzf"] Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.560196 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzdzf" event={"ID":"4596a8b1-1c76-48fd-8c48-ae9adb6f629e","Type":"ContainerStarted","Data":"8961553af9f964fb86eb5ffeed6b193c936afa7bc789bfe4b54fac72d542ae54"} Jan 10 16:45:52 crc kubenswrapper[5036]: W0110 16:45:52.564425 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612fc345_2e6f_43c5_bfe1_e605b58e1159.slice/crio-020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1 WatchSource:0}: Error finding container 020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1: Status 404 returned error can't find the container with id 020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1 Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.573058 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.638003 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:45:52 crc kubenswrapper[5036]: W0110 16:45:52.645059 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfc89f0_28a3_482f_8e8a_80d23e14c53a.slice/crio-ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490 WatchSource:0}: Error finding container ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490: Status 404 returned error can't find the container with id ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490 Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.723795 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-282lj"] Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.727073 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.732177 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.732416 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.734160 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-282lj"] Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.817527 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.817736 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.817812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.818226 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz2jq\" (UniqueName: \"kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.845000 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.914084 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:45:52 crc kubenswrapper[5036]: W0110 16:45:52.918084 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f0748c_4f3c_4636_88ca_f158e22015b2.slice/crio-1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814 WatchSource:0}: Error finding container 1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814: Status 404 returned error can't find the container with id 1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814 Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.920582 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz2jq\" (UniqueName: \"kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc 
kubenswrapper[5036]: I0110 16:45:52.920654 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.921740 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.921791 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.926708 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.927955 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.928576 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:52 crc kubenswrapper[5036]: I0110 16:45:52.937153 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz2jq\" (UniqueName: \"kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq\") pod \"nova-cell1-conductor-db-sync-282lj\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:53 crc kubenswrapper[5036]: W0110 16:45:53.027441 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbbe955a_7e12_4ed0_a795_4f182840d5e2.slice/crio-ec414d17de08af97d1a50e640090c5c04f2bff5800e2194d4e480087d7a953bb WatchSource:0}: Error finding container ec414d17de08af97d1a50e640090c5c04f2bff5800e2194d4e480087d7a953bb: Status 404 returned error can't find the container with id ec414d17de08af97d1a50e640090c5c04f2bff5800e2194d4e480087d7a953bb Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.027824 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.059791 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.613487 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerStarted","Data":"34fe13aa7a0dc4c70a37e636ad03b672e0c4fc33d3ec73a9bfd0ef49086ffde3"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.619426 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"612fc345-2e6f-43c5-bfe1-e605b58e1159","Type":"ContainerStarted","Data":"020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.620762 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04f0748c-4f3c-4636-88ca-f158e22015b2","Type":"ContainerStarted","Data":"1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.626267 5036 generic.go:334] "Generic (PLEG): container finished" podID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerID="99e6a6cd944efa6b4ca679b555841f1f50d263bf17e301f621bd91bd7bb7918d" exitCode=0 Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.626307 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" event={"ID":"cbbe955a-7e12-4ed0-a795-4f182840d5e2","Type":"ContainerDied","Data":"99e6a6cd944efa6b4ca679b555841f1f50d263bf17e301f621bd91bd7bb7918d"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.626341 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" event={"ID":"cbbe955a-7e12-4ed0-a795-4f182840d5e2","Type":"ContainerStarted","Data":"ec414d17de08af97d1a50e640090c5c04f2bff5800e2194d4e480087d7a953bb"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.632673 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzdzf" event={"ID":"4596a8b1-1c76-48fd-8c48-ae9adb6f629e","Type":"ContainerStarted","Data":"a46af9221f40c5ca7a6f8ac24fb548026064eee9253e5e0c792c3663486e9aa2"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.637805 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerStarted","Data":"ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490"} Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.651414 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-282lj"] Jan 10 16:45:53 crc kubenswrapper[5036]: W0110 16:45:53.667727 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d1109f9_6187_4b88_bb21_c43f2b25b4ad.slice/crio-d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67 WatchSource:0}: Error finding container d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67: Status 404 returned error can't find the container with id d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67 Jan 10 16:45:53 crc kubenswrapper[5036]: I0110 16:45:53.668963 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lzdzf" podStartSLOduration=2.668940886 podStartE2EDuration="2.668940886s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:53.662565224 +0000 UTC m=+1075.532800718" watchObservedRunningTime="2026-01-10 16:45:53.668940886 +0000 UTC m=+1075.539176380" Jan 10 16:45:54 crc kubenswrapper[5036]: I0110 16:45:54.654161 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-282lj" event={"ID":"7d1109f9-6187-4b88-bb21-c43f2b25b4ad","Type":"ContainerStarted","Data":"d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67"} Jan 10 16:45:55 crc kubenswrapper[5036]: I0110 16:45:55.503108 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:55 crc kubenswrapper[5036]: I0110 16:45:55.515822 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.677393 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerStarted","Data":"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.677909 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerStarted","Data":"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.678022 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-log" containerID="cri-o://49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" gracePeriod=30 Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.678285 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-metadata" containerID="cri-o://bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" gracePeriod=30 Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.682573 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"612fc345-2e6f-43c5-bfe1-e605b58e1159","Type":"ContainerStarted","Data":"062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.682672 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="612fc345-2e6f-43c5-bfe1-e605b58e1159" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3" gracePeriod=30 Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.685539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04f0748c-4f3c-4636-88ca-f158e22015b2","Type":"ContainerStarted","Data":"f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.692863 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" event={"ID":"cbbe955a-7e12-4ed0-a795-4f182840d5e2","Type":"ContainerStarted","Data":"acb2bdc31b10187a0578b148acc28dcee1334599feeb326394dcd30113661bee"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.693218 5036 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.694356 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-282lj" event={"ID":"7d1109f9-6187-4b88-bb21-c43f2b25b4ad","Type":"ContainerStarted","Data":"fa1da74453138d18273be89397bb33897347520e37fcd326e2f85eaf96f7237c"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.696281 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerStarted","Data":"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.696313 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerStarted","Data":"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc"} Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.712232 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.748663723 podStartE2EDuration="5.712217179s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="2026-01-10 16:45:52.861608583 +0000 UTC m=+1074.731844077" lastFinishedPulling="2026-01-10 16:45:55.825162039 +0000 UTC m=+1077.695397533" observedRunningTime="2026-01-10 16:45:56.707143054 +0000 UTC m=+1078.577378548" watchObservedRunningTime="2026-01-10 16:45:56.712217179 +0000 UTC m=+1078.582452673" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.754398 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.560448709 podStartE2EDuration="5.754380203s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="2026-01-10 16:45:52.567442183 +0000 UTC m=+1074.437677667" lastFinishedPulling="2026-01-10 16:45:55.761373667 +0000 UTC m=+1077.631609161" observedRunningTime="2026-01-10 16:45:56.753139807 +0000 UTC m=+1078.623375301" watchObservedRunningTime="2026-01-10 16:45:56.754380203 +0000 UTC m=+1078.624615697" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.778648 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" podStartSLOduration=5.778631395 podStartE2EDuration="5.778631395s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:56.773841338 +0000 UTC m=+1078.644076832" watchObservedRunningTime="2026-01-10 16:45:56.778631395 +0000 UTC m=+1078.648866889" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.798156 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-282lj" podStartSLOduration=4.798130472 podStartE2EDuration="4.798130472s" podCreationTimestamp="2026-01-10 16:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:56.791702988 +0000 UTC m=+1078.661938472" watchObservedRunningTime="2026-01-10 16:45:56.798130472 +0000 UTC m=+1078.668365966" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.821633 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.707332362 podStartE2EDuration="5.821608822s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="2026-01-10 16:45:52.647750276 +0000 UTC m=+1074.517985770" lastFinishedPulling="2026-01-10 16:45:55.762026736 +0000 UTC m=+1077.632262230" observedRunningTime="2026-01-10 16:45:56.812885903 +0000 UTC m=+1078.683121397" watchObservedRunningTime="2026-01-10 16:45:56.821608822 +0000 UTC m=+1078.691844316" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.842131 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.979843604 podStartE2EDuration="5.842116778s" podCreationTimestamp="2026-01-10 16:45:51 +0000 UTC" firstStartedPulling="2026-01-10 16:45:52.928645597 +0000 UTC m=+1074.798881091" lastFinishedPulling="2026-01-10 16:45:55.790918771 +0000 UTC m=+1077.661154265" observedRunningTime="2026-01-10 16:45:56.834309875 +0000 UTC m=+1078.704545369" watchObservedRunningTime="2026-01-10 16:45:56.842116778 +0000 UTC m=+1078.712352272" Jan 10 16:45:56 crc kubenswrapper[5036]: I0110 16:45:56.945739 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.148388 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.148450 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.163739 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.601811 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708117 5036 generic.go:334] "Generic (PLEG): container finished" podID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerID="bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" exitCode=0 Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708150 5036 generic.go:334] "Generic (PLEG): container finished" podID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerID="49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" exitCode=143 Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708183 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708254 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerDied","Data":"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301"} Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708320 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerDied","Data":"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53"} Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708340 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3264c467-4991-43e7-b6e4-c69ec14d37b9","Type":"ContainerDied","Data":"34fe13aa7a0dc4c70a37e636ad03b672e0c4fc33d3ec73a9bfd0ef49086ffde3"} Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.708361 5036 scope.go:117] "RemoveContainer" containerID="bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.742484 5036 scope.go:117] "RemoveContainer" containerID="49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.745885 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle\") pod \"3264c467-4991-43e7-b6e4-c69ec14d37b9\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.745998 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data\") pod \"3264c467-4991-43e7-b6e4-c69ec14d37b9\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.746060 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826kc\" (UniqueName: \"kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc\") pod \"3264c467-4991-43e7-b6e4-c69ec14d37b9\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.746149 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs\") pod \"3264c467-4991-43e7-b6e4-c69ec14d37b9\" (UID: \"3264c467-4991-43e7-b6e4-c69ec14d37b9\") " Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.763051 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs" (OuterVolumeSpecName: "logs") pod "3264c467-4991-43e7-b6e4-c69ec14d37b9" (UID: "3264c467-4991-43e7-b6e4-c69ec14d37b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.763640 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3264c467-4991-43e7-b6e4-c69ec14d37b9-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.763849 5036 scope.go:117] "RemoveContainer" containerID="bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" Jan 10 16:45:57 crc kubenswrapper[5036]: E0110 16:45:57.773958 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301\": container with ID starting with bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301 not found: ID does not exist" containerID="bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.774046 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301"} err="failed to get container status \"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301\": rpc error: code = NotFound desc = could not find container \"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301\": container with ID starting with bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301 not found: ID does not exist" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.774075 5036 scope.go:117] "RemoveContainer" containerID="49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" Jan 10 16:45:57 crc kubenswrapper[5036]: E0110 16:45:57.774960 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53\": container with ID starting with 49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53 not found: ID does not exist" containerID="49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.774988 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53"} err="failed to get container status \"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53\": rpc error: code = NotFound desc = could not find container \"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53\": container with ID starting with 49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53 not found: ID does not exist" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.775004 5036 scope.go:117] "RemoveContainer" containerID="bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.775525 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301"} err="failed to get container status \"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301\": rpc error: code = NotFound desc = could not find container \"bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301\": container with ID starting with bd8424647110934d8f01eec2e56932f6b23b10a4b471ed43041ed4bda3211301 not found: ID does not exist" Jan 
10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.775575 5036 scope.go:117] "RemoveContainer" containerID="49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.776086 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53"} err="failed to get container status \"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53\": rpc error: code = NotFound desc = could not find container \"49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53\": container with ID starting with 49167cf1212ddc8304d8045d25926038e11bc0b9420c18ef48038f1ad6a7bc53 not found: ID does not exist" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.782622 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc" (OuterVolumeSpecName: "kube-api-access-826kc") pod "3264c467-4991-43e7-b6e4-c69ec14d37b9" (UID: "3264c467-4991-43e7-b6e4-c69ec14d37b9"). InnerVolumeSpecName "kube-api-access-826kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.801179 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data" (OuterVolumeSpecName: "config-data") pod "3264c467-4991-43e7-b6e4-c69ec14d37b9" (UID: "3264c467-4991-43e7-b6e4-c69ec14d37b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.827655 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3264c467-4991-43e7-b6e4-c69ec14d37b9" (UID: "3264c467-4991-43e7-b6e4-c69ec14d37b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.865223 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.865263 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3264c467-4991-43e7-b6e4-c69ec14d37b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:57 crc kubenswrapper[5036]: I0110 16:45:57.865276 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826kc\" (UniqueName: \"kubernetes.io/projected/3264c467-4991-43e7-b6e4-c69ec14d37b9-kube-api-access-826kc\") on node \"crc\" DevicePath \"\"" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.040871 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.056939 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.068077 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:58 crc kubenswrapper[5036]: E0110 16:45:58.068471 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-log" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.068488 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-log" Jan 10 16:45:58 crc kubenswrapper[5036]: E0110 16:45:58.068521 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-metadata" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.068527 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-metadata" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.068698 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-metadata" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.068718 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" containerName="nova-metadata-log" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.069925 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.072522 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.072632 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.085826 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.170065 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.170143 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.170190 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.170223 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtw2k\" (UniqueName: \"kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.170263 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.272157 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.272753 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.272812 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc 
kubenswrapper[5036]: I0110 16:45:58.272840 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtw2k\" (UniqueName: \"kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.272880 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.273742 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.278381 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.278507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.296038 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.298415 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtw2k\" (UniqueName: \"kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k\") pod \"nova-metadata-0\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.413089 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.520794 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3264c467-4991-43e7-b6e4-c69ec14d37b9" path="/var/lib/kubelet/pods/3264c467-4991-43e7-b6e4-c69ec14d37b9/volumes" Jan 10 16:45:58 crc kubenswrapper[5036]: I0110 16:45:58.925597 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:45:59 crc kubenswrapper[5036]: I0110 16:45:59.736273 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerStarted","Data":"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46"} Jan 10 16:45:59 crc kubenswrapper[5036]: I0110 16:45:59.736612 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerStarted","Data":"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7"} Jan 10 16:45:59 crc kubenswrapper[5036]: I0110 16:45:59.736624 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerStarted","Data":"b9f2a3f380b845fcf98dbed772f570d79c13930305242b8715760e59597d3dee"} Jan 10 16:45:59 crc kubenswrapper[5036]: I0110 16:45:59.761258 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.761241096 podStartE2EDuration="1.761241096s" podCreationTimestamp="2026-01-10 16:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:45:59.753192226 +0000 UTC m=+1081.623427720" watchObservedRunningTime="2026-01-10 16:45:59.761241096 +0000 UTC m=+1081.631476590" Jan 10 16:46:00 crc kubenswrapper[5036]: I0110 16:46:00.759378 5036 generic.go:334] "Generic (PLEG): container finished" podID="4596a8b1-1c76-48fd-8c48-ae9adb6f629e" containerID="a46af9221f40c5ca7a6f8ac24fb548026064eee9253e5e0c792c3663486e9aa2" exitCode=0 Jan 10 16:46:00 crc kubenswrapper[5036]: I0110 16:46:00.759428 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzdzf" event={"ID":"4596a8b1-1c76-48fd-8c48-ae9adb6f629e","Type":"ContainerDied","Data":"a46af9221f40c5ca7a6f8ac24fb548026064eee9253e5e0c792c3663486e9aa2"} Jan 10 16:46:01 crc kubenswrapper[5036]: I0110 16:46:01.933724 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:01 crc kubenswrapper[5036]: I0110 16:46:01.935847 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.156667 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.162937 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.203642 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.248567 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts\") pod \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.248802 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle\") pod \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.248864 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data\") pod \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.248954 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc4vm\" (UniqueName: \"kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm\") pod \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\" (UID: \"4596a8b1-1c76-48fd-8c48-ae9adb6f629e\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.259823 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts" (OuterVolumeSpecName: "scripts") pod "4596a8b1-1c76-48fd-8c48-ae9adb6f629e" (UID: "4596a8b1-1c76-48fd-8c48-ae9adb6f629e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.259878 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm" (OuterVolumeSpecName: "kube-api-access-zc4vm") pod "4596a8b1-1c76-48fd-8c48-ae9adb6f629e" (UID: "4596a8b1-1c76-48fd-8c48-ae9adb6f629e"). InnerVolumeSpecName "kube-api-access-zc4vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.273612 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data" (OuterVolumeSpecName: "config-data") pod "4596a8b1-1c76-48fd-8c48-ae9adb6f629e" (UID: "4596a8b1-1c76-48fd-8c48-ae9adb6f629e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.279099 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4596a8b1-1c76-48fd-8c48-ae9adb6f629e" (UID: "4596a8b1-1c76-48fd-8c48-ae9adb6f629e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.313740 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.351040 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc4vm\" (UniqueName: \"kubernetes.io/projected/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-kube-api-access-zc4vm\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.351073 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.351084 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.351094 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4596a8b1-1c76-48fd-8c48-ae9adb6f629e-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.386108 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.386381 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="dnsmasq-dns" containerID="cri-o://9ecf3be69c813eac55bc3cabc940b42fb5baf783c6673aee83b2d6b3f92f966b" gracePeriod=10 Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.778782 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lzdzf" event={"ID":"4596a8b1-1c76-48fd-8c48-ae9adb6f629e","Type":"ContainerDied","Data":"8961553af9f964fb86eb5ffeed6b193c936afa7bc789bfe4b54fac72d542ae54"} Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.779028 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8961553af9f964fb86eb5ffeed6b193c936afa7bc789bfe4b54fac72d542ae54" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.778824 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lzdzf" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.781471 5036 generic.go:334] "Generic (PLEG): container finished" podID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerID="9ecf3be69c813eac55bc3cabc940b42fb5baf783c6673aee83b2d6b3f92f966b" exitCode=0 Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.781569 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" event={"ID":"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac","Type":"ContainerDied","Data":"9ecf3be69c813eac55bc3cabc940b42fb5baf783c6673aee83b2d6b3f92f966b"} Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.836835 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.849023 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.961161 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-877bb\" (UniqueName: \"kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb\") pod \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.961223 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config\") pod \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.961256 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb\") pod \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.961403 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc\") pod \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.961477 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb\") pod \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\" (UID: \"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac\") " Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.965705 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.976300 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.171:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.976437 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.171:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:02 crc kubenswrapper[5036]: I0110 16:46:02.979180 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb" (OuterVolumeSpecName: "kube-api-access-877bb") pod "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" (UID: "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac"). InnerVolumeSpecName "kube-api-access-877bb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.006033 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.006332 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-log" containerID="cri-o://40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" gracePeriod=30 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.006482 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-metadata" containerID="cri-o://2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" gracePeriod=30 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.015402 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" (UID: "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.029709 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config" (OuterVolumeSpecName: "config") pod "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" (UID: "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.046089 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" (UID: "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.063180 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.063210 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-877bb\" (UniqueName: \"kubernetes.io/projected/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-kube-api-access-877bb\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.063222 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.063231 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.066210 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" (UID: "68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.165712 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.380812 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.437656 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.437787 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.780703 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791276 5036 generic.go:334] "Generic (PLEG): container finished" podID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerID="2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" exitCode=0 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791313 5036 generic.go:334] "Generic (PLEG): container finished" podID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerID="40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" exitCode=143 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791326 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791367 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerDied","Data":"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46"} Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791418 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerDied","Data":"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7"} Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791430 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eb875d5f-5f8c-43f9-964e-a805a9132aa3","Type":"ContainerDied","Data":"b9f2a3f380b845fcf98dbed772f570d79c13930305242b8715760e59597d3dee"} Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.791441 5036 scope.go:117] "RemoveContainer" containerID="2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.793484 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" event={"ID":"68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac","Type":"ContainerDied","Data":"0ea23acb1b5f4d0fef3c4f9b97ca551ff6835da5b1a12cf653c4d7aa73b1ee5c"} Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.793536 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-zv5r2" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.794875 5036 generic.go:334] "Generic (PLEG): container finished" podID="7d1109f9-6187-4b88-bb21-c43f2b25b4ad" containerID="fa1da74453138d18273be89397bb33897347520e37fcd326e2f85eaf96f7237c" exitCode=0 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.795027 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-log" containerID="cri-o://edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc" gracePeriod=30 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.795237 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-282lj" event={"ID":"7d1109f9-6187-4b88-bb21-c43f2b25b4ad","Type":"ContainerDied","Data":"fa1da74453138d18273be89397bb33897347520e37fcd326e2f85eaf96f7237c"} Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.795920 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-api" containerID="cri-o://fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499" gracePeriod=30 Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.832000 5036 scope.go:117] "RemoveContainer" containerID="40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.860610 5036 scope.go:117] "RemoveContainer" containerID="2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" Jan 10 16:46:03 crc kubenswrapper[5036]: E0110 16:46:03.861078 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46\": container with ID starting with 
2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46 not found: ID does not exist" containerID="2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861113 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46"} err="failed to get container status \"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46\": rpc error: code = NotFound desc = could not find container \"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46\": container with ID starting with 2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46 not found: ID does not exist" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861134 5036 scope.go:117] "RemoveContainer" containerID="40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" Jan 10 16:46:03 crc kubenswrapper[5036]: E0110 16:46:03.861337 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7\": container with ID starting with 40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7 not found: ID does not exist" containerID="40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861359 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7"} err="failed to get container status \"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7\": rpc error: code = NotFound desc = could not find container \"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7\": container with ID starting with 40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7 not found: ID does not exist" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861372 5036 scope.go:117] "RemoveContainer" containerID="2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861566 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46"} err="failed to get container status \"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46\": rpc error: code = NotFound desc = could not find container \"2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46\": container with ID starting with 2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46 not found: ID does not exist" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861585 5036 scope.go:117] "RemoveContainer" containerID="40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861808 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7"} err="failed to get container status \"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7\": rpc error: code = NotFound desc = could not find container \"40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7\": container with ID starting with 40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7 not found: ID does not exist" Jan 10 
16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.861827 5036 scope.go:117] "RemoveContainer" containerID="9ecf3be69c813eac55bc3cabc940b42fb5baf783c6673aee83b2d6b3f92f966b" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.883962 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs\") pod \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.884073 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtw2k\" (UniqueName: \"kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k\") pod \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.884161 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data\") pod \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.884195 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs\") pod \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.884274 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle\") pod \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\" (UID: \"eb875d5f-5f8c-43f9-964e-a805a9132aa3\") " Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.886395 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs" (OuterVolumeSpecName: "logs") pod "eb875d5f-5f8c-43f9-964e-a805a9132aa3" (UID: "eb875d5f-5f8c-43f9-964e-a805a9132aa3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.891064 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k" (OuterVolumeSpecName: "kube-api-access-dtw2k") pod "eb875d5f-5f8c-43f9-964e-a805a9132aa3" (UID: "eb875d5f-5f8c-43f9-964e-a805a9132aa3"). InnerVolumeSpecName "kube-api-access-dtw2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.891138 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.902881 5036 scope.go:117] "RemoveContainer" containerID="e81520bc27313e433ac11a4c5115d433794d2013edc4e2669d381b538e0e9098" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.907627 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-zv5r2"] Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.919365 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data" (OuterVolumeSpecName: "config-data") pod "eb875d5f-5f8c-43f9-964e-a805a9132aa3" (UID: "eb875d5f-5f8c-43f9-964e-a805a9132aa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.927466 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb875d5f-5f8c-43f9-964e-a805a9132aa3" (UID: "eb875d5f-5f8c-43f9-964e-a805a9132aa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.934363 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "eb875d5f-5f8c-43f9-964e-a805a9132aa3" (UID: "eb875d5f-5f8c-43f9-964e-a805a9132aa3"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.988838 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.989585 5036 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.989607 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb875d5f-5f8c-43f9-964e-a805a9132aa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.989621 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb875d5f-5f8c-43f9-964e-a805a9132aa3-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:03 crc kubenswrapper[5036]: I0110 16:46:03.989631 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtw2k\" (UniqueName: \"kubernetes.io/projected/eb875d5f-5f8c-43f9-964e-a805a9132aa3-kube-api-access-dtw2k\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.132024 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.139616 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.155992 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:04 crc kubenswrapper[5036]: E0110 16:46:04.156363 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-log" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156378 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-log" Jan 10 16:46:04 crc kubenswrapper[5036]: E0110 16:46:04.156395 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-metadata" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156400 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-metadata" Jan 10 16:46:04 crc kubenswrapper[5036]: E0110 16:46:04.156417 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="dnsmasq-dns" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156423 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="dnsmasq-dns" Jan 10 16:46:04 crc kubenswrapper[5036]: E0110 16:46:04.156432 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="init" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156438 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="init" Jan 10 16:46:04 crc kubenswrapper[5036]: E0110 16:46:04.156458 5036 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4596a8b1-1c76-48fd-8c48-ae9adb6f629e" containerName="nova-manage" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156464 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="4596a8b1-1c76-48fd-8c48-ae9adb6f629e" containerName="nova-manage" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156613 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="4596a8b1-1c76-48fd-8c48-ae9adb6f629e" containerName="nova-manage" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156625 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" containerName="dnsmasq-dns" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156636 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-log" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.156644 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" containerName="nova-metadata-metadata" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.157560 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.161690 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.161865 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.163295 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.295040 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsqdm\" (UniqueName: \"kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.295093 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.295111 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.295160 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.295424 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.397372 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.397487 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsqdm\" (UniqueName: \"kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.397510 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.397538 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.397589 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.398060 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.404316 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.405619 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.405712 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.419918 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsqdm\" (UniqueName: 
\"kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm\") pod \"nova-metadata-0\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.477290 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.526629 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac" path="/var/lib/kubelet/pods/68470a7f-9fcd-43ac-ae49-7db9ae4ac0ac/volumes" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.527574 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb875d5f-5f8c-43f9-964e-a805a9132aa3" path="/var/lib/kubelet/pods/eb875d5f-5f8c-43f9-964e-a805a9132aa3/volumes" Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.808446 5036 generic.go:334] "Generic (PLEG): container finished" podID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerID="edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc" exitCode=143 Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.808520 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerDied","Data":"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc"} Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.808908 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerName="nova-scheduler-scheduler" containerID="cri-o://f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" gracePeriod=30 Jan 10 16:46:04 crc kubenswrapper[5036]: I0110 16:46:04.933455 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.247532 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.312453 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle\") pod \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.312855 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz2jq\" (UniqueName: \"kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq\") pod \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.312943 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data\") pod \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.313022 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts\") pod \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\" (UID: \"7d1109f9-6187-4b88-bb21-c43f2b25b4ad\") " Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.316485 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq" (OuterVolumeSpecName: "kube-api-access-xz2jq") pod "7d1109f9-6187-4b88-bb21-c43f2b25b4ad" (UID: "7d1109f9-6187-4b88-bb21-c43f2b25b4ad"). InnerVolumeSpecName "kube-api-access-xz2jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.328180 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts" (OuterVolumeSpecName: "scripts") pod "7d1109f9-6187-4b88-bb21-c43f2b25b4ad" (UID: "7d1109f9-6187-4b88-bb21-c43f2b25b4ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.350696 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data" (OuterVolumeSpecName: "config-data") pod "7d1109f9-6187-4b88-bb21-c43f2b25b4ad" (UID: "7d1109f9-6187-4b88-bb21-c43f2b25b4ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.361068 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d1109f9-6187-4b88-bb21-c43f2b25b4ad" (UID: "7d1109f9-6187-4b88-bb21-c43f2b25b4ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.415662 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz2jq\" (UniqueName: \"kubernetes.io/projected/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-kube-api-access-xz2jq\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.415711 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.415724 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.415736 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d1109f9-6187-4b88-bb21-c43f2b25b4ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.820082 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-282lj" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.820084 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-282lj" event={"ID":"7d1109f9-6187-4b88-bb21-c43f2b25b4ad","Type":"ContainerDied","Data":"d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67"} Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.820471 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2961e94159758729383c8d00e11e6cb5f935da214ede77b69eac364420b4c67" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.822807 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerStarted","Data":"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac"} Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.822830 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerStarted","Data":"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586"} Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.822839 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerStarted","Data":"8fe144268f39591c54dd996a66d3da78191a39ff8afa01528fe668be420bdfe9"} Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.850581 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.85055968 podStartE2EDuration="1.85055968s" podCreationTimestamp="2026-01-10 16:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:05.845122515 +0000 UTC m=+1087.715358009" watchObservedRunningTime="2026-01-10 16:46:05.85055968 +0000 UTC m=+1087.720795174" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.937217 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 10 16:46:05 crc kubenswrapper[5036]: E0110 16:46:05.937817 5036 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7d1109f9-6187-4b88-bb21-c43f2b25b4ad" containerName="nova-cell1-conductor-db-sync" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.937840 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1109f9-6187-4b88-bb21-c43f2b25b4ad" containerName="nova-cell1-conductor-db-sync" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.938063 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d1109f9-6187-4b88-bb21-c43f2b25b4ad" containerName="nova-cell1-conductor-db-sync" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.938687 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.943468 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 10 16:46:05 crc kubenswrapper[5036]: I0110 16:46:05.947946 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.024958 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.025053 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.025081 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbtm\" (UniqueName: \"kubernetes.io/projected/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-kube-api-access-mwbtm\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.126449 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.126522 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbtm\" (UniqueName: \"kubernetes.io/projected/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-kube-api-access-mwbtm\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.127471 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.131507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.135570 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.143496 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbtm\" (UniqueName: \"kubernetes.io/projected/1de05ac5-ff01-445f-b1a8-41a7db2a70c4-kube-api-access-mwbtm\") pod \"nova-cell1-conductor-0\" (UID: \"1de05ac5-ff01-445f-b1a8-41a7db2a70c4\") " pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.258645 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.704438 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 10 16:46:06 crc kubenswrapper[5036]: I0110 16:46:06.835190 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1de05ac5-ff01-445f-b1a8-41a7db2a70c4","Type":"ContainerStarted","Data":"e198a35a1462a99a21e2bac4d2f34e2bd03bae5061cad147c1541082c28bc7ec"} Jan 10 16:46:07 crc kubenswrapper[5036]: E0110 16:46:07.166815 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:07 crc kubenswrapper[5036]: E0110 16:46:07.169858 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:07 crc kubenswrapper[5036]: E0110 16:46:07.171992 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:07 crc kubenswrapper[5036]: E0110 16:46:07.172081 5036 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerName="nova-scheduler-scheduler" Jan 10 16:46:07 crc kubenswrapper[5036]: I0110 16:46:07.884979 5036 generic.go:334] "Generic (PLEG): container finished" podID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerID="f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" exitCode=0 Jan 10 16:46:07 crc kubenswrapper[5036]: I0110 16:46:07.885068 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"04f0748c-4f3c-4636-88ca-f158e22015b2","Type":"ContainerDied","Data":"f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6"} Jan 10 16:46:07 crc kubenswrapper[5036]: I0110 16:46:07.896990 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1de05ac5-ff01-445f-b1a8-41a7db2a70c4","Type":"ContainerStarted","Data":"23795508888f4f34436fe664ca519d4f5dbd7bda0de8b866d4647bc35d959aa6"} Jan 10 16:46:07 crc kubenswrapper[5036]: I0110 16:46:07.897132 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:07 crc kubenswrapper[5036]: I0110 16:46:07.922470 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.922449224 podStartE2EDuration="2.922449224s" podCreationTimestamp="2026-01-10 16:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:07.919362006 +0000 UTC m=+1089.789597500" watchObservedRunningTime="2026-01-10 16:46:07.922449224 +0000 UTC m=+1089.792684718" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.352194 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.480930 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle\") pod \"04f0748c-4f3c-4636-88ca-f158e22015b2\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.481025 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data\") pod \"04f0748c-4f3c-4636-88ca-f158e22015b2\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.481147 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7z5\" (UniqueName: \"kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5\") pod \"04f0748c-4f3c-4636-88ca-f158e22015b2\" (UID: \"04f0748c-4f3c-4636-88ca-f158e22015b2\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.492047 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5" (OuterVolumeSpecName: "kube-api-access-kt7z5") pod "04f0748c-4f3c-4636-88ca-f158e22015b2" (UID: "04f0748c-4f3c-4636-88ca-f158e22015b2"). InnerVolumeSpecName "kube-api-access-kt7z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.519131 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04f0748c-4f3c-4636-88ca-f158e22015b2" (UID: "04f0748c-4f3c-4636-88ca-f158e22015b2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.520878 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data" (OuterVolumeSpecName: "config-data") pod "04f0748c-4f3c-4636-88ca-f158e22015b2" (UID: "04f0748c-4f3c-4636-88ca-f158e22015b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.583817 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7z5\" (UniqueName: \"kubernetes.io/projected/04f0748c-4f3c-4636-88ca-f158e22015b2-kube-api-access-kt7z5\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.583845 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.583856 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f0748c-4f3c-4636-88ca-f158e22015b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.625782 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.684943 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle\") pod \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.685049 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqdmj\" (UniqueName: \"kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj\") pod \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.685151 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs\") pod \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.685277 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data\") pod \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\" (UID: \"3dfc89f0-28a3-482f-8e8a-80d23e14c53a\") " Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.690337 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs" (OuterVolumeSpecName: "logs") pod "3dfc89f0-28a3-482f-8e8a-80d23e14c53a" (UID: "3dfc89f0-28a3-482f-8e8a-80d23e14c53a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.697291 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj" (OuterVolumeSpecName: "kube-api-access-fqdmj") pod "3dfc89f0-28a3-482f-8e8a-80d23e14c53a" (UID: "3dfc89f0-28a3-482f-8e8a-80d23e14c53a"). InnerVolumeSpecName "kube-api-access-fqdmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.712213 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dfc89f0-28a3-482f-8e8a-80d23e14c53a" (UID: "3dfc89f0-28a3-482f-8e8a-80d23e14c53a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.721262 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data" (OuterVolumeSpecName: "config-data") pod "3dfc89f0-28a3-482f-8e8a-80d23e14c53a" (UID: "3dfc89f0-28a3-482f-8e8a-80d23e14c53a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.786796 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.786824 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.786834 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqdmj\" (UniqueName: \"kubernetes.io/projected/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-kube-api-access-fqdmj\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.786844 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfc89f0-28a3-482f-8e8a-80d23e14c53a-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.906592 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04f0748c-4f3c-4636-88ca-f158e22015b2","Type":"ContainerDied","Data":"1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814"} Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.907577 5036 scope.go:117] "RemoveContainer" containerID="f0b1454a877cbcbc763fdc0cd07a7831c8b5ebfc2b424f9fb799863628ef47d6" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.906637 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.908524 5036 generic.go:334] "Generic (PLEG): container finished" podID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerID="fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499" exitCode=0 Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.908763 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.908759 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerDied","Data":"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499"} Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.908869 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3dfc89f0-28a3-482f-8e8a-80d23e14c53a","Type":"ContainerDied","Data":"ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490"} Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.934773 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.947366 5036 scope.go:117] "RemoveContainer" containerID="fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.948739 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.963731 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.978637 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.989011 5036 scope.go:117] "RemoveContainer" containerID="edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.997565 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:08 crc kubenswrapper[5036]: E0110 16:46:08.998265 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-api" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.998381 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-api" Jan 10 16:46:08 crc kubenswrapper[5036]: E0110 16:46:08.998459 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerName="nova-scheduler-scheduler" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.998520 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerName="nova-scheduler-scheduler" Jan 10 16:46:08 crc kubenswrapper[5036]: E0110 16:46:08.999533 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-log" Jan 10 16:46:08 crc kubenswrapper[5036]: I0110 16:46:08.999645 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-log" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.000037 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-api" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.000140 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" containerName="nova-api-log" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.000219 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" containerName="nova-scheduler-scheduler" 
Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.001138 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.003607 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.010087 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.022058 5036 scope.go:117] "RemoveContainer" containerID="fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.024774 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:09 crc kubenswrapper[5036]: E0110 16:46:09.025869 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499\": container with ID starting with fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499 not found: ID does not exist" containerID="fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.025915 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499"} err="failed to get container status \"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499\": rpc error: code = NotFound desc = could not find container \"fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499\": container with ID starting with fe2924cdc97f368f7a4a4d0b1260805c4b83a480f39f57798b5799b9fc70a499 not found: ID does not exist" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.025941 5036 scope.go:117] "RemoveContainer" containerID="edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc" Jan 10 16:46:09 crc kubenswrapper[5036]: E0110 16:46:09.026366 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc\": container with ID starting with edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc not found: ID does not exist" containerID="edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.026400 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc"} err="failed to get container status \"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc\": rpc error: code = NotFound desc = could not find container \"edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc\": container with ID starting with edd5c154470e69955a20788b278cb256aa5c00b3ffc4be2cd299a8a7f9a489dc not found: ID does not exist" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.028159 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.030645 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.038749 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.097649 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.097815 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.097842 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpmzw\" (UniqueName: \"kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.097906 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.097980 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.098002 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbs7\" (UniqueName: \"kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.098028 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.199869 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.199967 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.199992 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpmzw\" (UniqueName: \"kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.200024 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.200113 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.200140 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbs7\" (UniqueName: \"kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.200169 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.200864 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.205040 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.205947 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.205966 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.206294 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.216972 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpmzw\" (UniqueName: \"kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw\") pod \"nova-api-0\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.217874 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbs7\" (UniqueName: \"kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7\") pod \"nova-scheduler-0\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.320886 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.360441 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.478001 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.478364 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.798173 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.907633 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.917146 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerStarted","Data":"fd0c0020c235d390c5ca73901182bab59d710a564768200dff18f6bfa6b2ae1d"} Jan 10 16:46:09 crc kubenswrapper[5036]: I0110 16:46:09.920368 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa","Type":"ContainerStarted","Data":"dad126616a282e50d8bf0fe27890e274e2dddbccfd6997eae31d78fe1b4e8072"} Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.525409 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f0748c-4f3c-4636-88ca-f158e22015b2" path="/var/lib/kubelet/pods/04f0748c-4f3c-4636-88ca-f158e22015b2/volumes" Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.526326 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dfc89f0-28a3-482f-8e8a-80d23e14c53a" path="/var/lib/kubelet/pods/3dfc89f0-28a3-482f-8e8a-80d23e14c53a/volumes" Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.934558 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa","Type":"ContainerStarted","Data":"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec"} Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.941107 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerStarted","Data":"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6"} Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.941148 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerStarted","Data":"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02"} Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.958912 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9588966709999998 podStartE2EDuration="2.958896671s" podCreationTimestamp="2026-01-10 16:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:10.95604955 +0000 UTC m=+1092.826285054" watchObservedRunningTime="2026-01-10 16:46:10.958896671 +0000 UTC m=+1092.829132165" Jan 10 16:46:10 crc kubenswrapper[5036]: I0110 16:46:10.986137 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.986122889 podStartE2EDuration="2.986122889s" podCreationTimestamp="2026-01-10 16:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:10.980497258 +0000 UTC m=+1092.850732792" watchObservedRunningTime="2026-01-10 16:46:10.986122889 +0000 UTC m=+1092.856358383" Jan 10 16:46:11 crc kubenswrapper[5036]: I0110 16:46:11.362141 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 10 16:46:11 crc kubenswrapper[5036]: I0110 16:46:11.810799 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 10 16:46:14 crc kubenswrapper[5036]: I0110 16:46:14.321451 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 10 16:46:14 crc kubenswrapper[5036]: I0110 16:46:14.477957 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 10 16:46:14 crc kubenswrapper[5036]: I0110 16:46:14.478277 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 10 16:46:15 crc kubenswrapper[5036]: I0110 16:46:15.496850 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:15 crc kubenswrapper[5036]: I0110 16:46:15.496905 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:19 crc kubenswrapper[5036]: I0110 16:46:19.321514 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 10 16:46:19 crc kubenswrapper[5036]: I0110 16:46:19.349340 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 10 16:46:19 crc kubenswrapper[5036]: I0110 
16:46:19.360748 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:19 crc kubenswrapper[5036]: I0110 16:46:19.360820 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:20 crc kubenswrapper[5036]: I0110 16:46:20.065151 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 10 16:46:20 crc kubenswrapper[5036]: I0110 16:46:20.444336 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:20 crc kubenswrapper[5036]: I0110 16:46:20.451770 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 10 16:46:24 crc kubenswrapper[5036]: I0110 16:46:24.483223 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 10 16:46:24 crc kubenswrapper[5036]: I0110 16:46:24.487016 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 10 16:46:24 crc kubenswrapper[5036]: I0110 16:46:24.491796 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 10 16:46:25 crc kubenswrapper[5036]: I0110 16:46:25.086642 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 10 16:46:26 crc kubenswrapper[5036]: W0110 16:46:26.721083 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb875d5f_5f8c_43f9_964e_a805a9132aa3.slice/crio-b9f2a3f380b845fcf98dbed772f570d79c13930305242b8715760e59597d3dee WatchSource:0}: Error finding container b9f2a3f380b845fcf98dbed772f570d79c13930305242b8715760e59597d3dee: Status 404 returned error can't find the container with id b9f2a3f380b845fcf98dbed772f570d79c13930305242b8715760e59597d3dee Jan 10 16:46:26 crc kubenswrapper[5036]: W0110 16:46:26.724150 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb875d5f_5f8c_43f9_964e_a805a9132aa3.slice/crio-40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7.scope WatchSource:0}: Error finding container 40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7: Status 404 returned error can't find the container with id 40e9b1f6d05a7dec1e4f536e0ef4873a3fcfc2801a2ba529aedb6f62283aedc7 Jan 10 16:46:26 crc kubenswrapper[5036]: W0110 16:46:26.724462 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb875d5f_5f8c_43f9_964e_a805a9132aa3.slice/crio-2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46.scope WatchSource:0}: Error finding container 2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46: Status 404 returned error can't find the container with id 2feafa24de14a1a71d38940e293342b971a3f1cd49c006d99fcdff09b7950e46 Jan 10 16:46:26 crc kubenswrapper[5036]: E0110 16:46:26.724751 
5036 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f0748c_4f3c_4636_88ca_f158e22015b2.slice/crio-1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814: Error finding container 1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814: Status 404 returned error can't find the container with id 1423506609f736729b6a197474014e731984e7d0ae9d79af89e66e4e0e15d814 Jan 10 16:46:26 crc kubenswrapper[5036]: E0110 16:46:26.730515 5036 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dfc89f0_28a3_482f_8e8a_80d23e14c53a.slice/crio-ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490: Error finding container ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490: Status 404 returned error can't find the container with id ecd587469539a09031cea2c8d1e793cb56f798f9a2e503b6127bf8ce168f4490 Jan 10 16:46:26 crc kubenswrapper[5036]: E0110 16:46:26.956477 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612fc345_2e6f_43c5_bfe1_e605b58e1159.slice/crio-062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612fc345_2e6f_43c5_bfe1_e605b58e1159.slice/crio-conmon-062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3.scope\": RecentStats: unable to find data in memory cache]" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.096791 5036 generic.go:334] "Generic (PLEG): container finished" podID="612fc345-2e6f-43c5-bfe1-e605b58e1159" containerID="062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3" exitCode=137 Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.096853 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"612fc345-2e6f-43c5-bfe1-e605b58e1159","Type":"ContainerDied","Data":"062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3"} Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.097228 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"612fc345-2e6f-43c5-bfe1-e605b58e1159","Type":"ContainerDied","Data":"020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1"} Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.097241 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="020ec95e3e1c00acc34f2809cdb4cb985b639ec08e9d7076eb6c63337631d9f1" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.097608 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.267913 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle\") pod \"612fc345-2e6f-43c5-bfe1-e605b58e1159\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.267974 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data\") pod \"612fc345-2e6f-43c5-bfe1-e605b58e1159\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.268052 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zctqr\" (UniqueName: \"kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr\") pod \"612fc345-2e6f-43c5-bfe1-e605b58e1159\" (UID: \"612fc345-2e6f-43c5-bfe1-e605b58e1159\") " Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.285223 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr" (OuterVolumeSpecName: "kube-api-access-zctqr") pod "612fc345-2e6f-43c5-bfe1-e605b58e1159" (UID: "612fc345-2e6f-43c5-bfe1-e605b58e1159"). InnerVolumeSpecName "kube-api-access-zctqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.313321 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "612fc345-2e6f-43c5-bfe1-e605b58e1159" (UID: "612fc345-2e6f-43c5-bfe1-e605b58e1159"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.318225 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data" (OuterVolumeSpecName: "config-data") pod "612fc345-2e6f-43c5-bfe1-e605b58e1159" (UID: "612fc345-2e6f-43c5-bfe1-e605b58e1159"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.371153 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.371188 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612fc345-2e6f-43c5-bfe1-e605b58e1159-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:27 crc kubenswrapper[5036]: I0110 16:46:27.371201 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zctqr\" (UniqueName: \"kubernetes.io/projected/612fc345-2e6f-43c5-bfe1-e605b58e1159-kube-api-access-zctqr\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.106955 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.139813 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.150630 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.159858 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:46:28 crc kubenswrapper[5036]: E0110 16:46:28.160294 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612fc345-2e6f-43c5-bfe1-e605b58e1159" containerName="nova-cell1-novncproxy-novncproxy" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.160313 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="612fc345-2e6f-43c5-bfe1-e605b58e1159" containerName="nova-cell1-novncproxy-novncproxy" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.160526 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="612fc345-2e6f-43c5-bfe1-e605b58e1159" containerName="nova-cell1-novncproxy-novncproxy" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.161262 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.163528 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.163629 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.164085 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.179109 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.285472 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.285517 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.285581 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.285602 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hzcj\" (UniqueName: \"kubernetes.io/projected/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-kube-api-access-6hzcj\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.286026 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.389699 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.389746 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hzcj\" (UniqueName: \"kubernetes.io/projected/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-kube-api-access-6hzcj\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.389809 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.389860 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.389907 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.394173 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.394279 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.394789 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.406293 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.414930 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hzcj\" (UniqueName: \"kubernetes.io/projected/8c0ed0eb-87d3-43cc-bdbb-1269890e7799-kube-api-access-6hzcj\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c0ed0eb-87d3-43cc-bdbb-1269890e7799\") " pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.494279 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.517552 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="612fc345-2e6f-43c5-bfe1-e605b58e1159" path="/var/lib/kubelet/pods/612fc345-2e6f-43c5-bfe1-e605b58e1159/volumes" Jan 10 16:46:28 crc kubenswrapper[5036]: I0110 16:46:28.970393 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 10 16:46:28 crc kubenswrapper[5036]: W0110 16:46:28.975016 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c0ed0eb_87d3_43cc_bdbb_1269890e7799.slice/crio-07ef7a3565bfe26570836d81d5179e90255318d80f38d63cfd0111a34d00f0cd WatchSource:0}: Error finding container 07ef7a3565bfe26570836d81d5179e90255318d80f38d63cfd0111a34d00f0cd: Status 404 returned error can't find the container with id 07ef7a3565bfe26570836d81d5179e90255318d80f38d63cfd0111a34d00f0cd Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.116756 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c0ed0eb-87d3-43cc-bdbb-1269890e7799","Type":"ContainerStarted","Data":"07ef7a3565bfe26570836d81d5179e90255318d80f38d63cfd0111a34d00f0cd"} Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.364951 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.365333 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.365586 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.365639 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.369087 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.383009 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.555273 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.558776 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.580207 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.639726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.639787 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff89t\" (UniqueName: \"kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.639872 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.640056 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.640242 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.741219 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.741306 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.742293 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.741335 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff89t\" 
(UniqueName: \"kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.742435 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.742326 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.742605 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.743215 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.743666 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.766536 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff89t\" (UniqueName: \"kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t\") pod \"dnsmasq-dns-5b856c5697-8mt6q\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:29 crc kubenswrapper[5036]: I0110 16:46:29.885797 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:30 crc kubenswrapper[5036]: I0110 16:46:30.126799 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c0ed0eb-87d3-43cc-bdbb-1269890e7799","Type":"ContainerStarted","Data":"23494f8890e93055634a2029d7c1d7fa6754c5ec165d1e2b4a3166ca29084d77"} Jan 10 16:46:30 crc kubenswrapper[5036]: I0110 16:46:30.150320 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.150304522 podStartE2EDuration="2.150304522s" podCreationTimestamp="2026-01-10 16:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:30.149413237 +0000 UTC m=+1112.019648721" watchObservedRunningTime="2026-01-10 16:46:30.150304522 +0000 UTC m=+1112.020540016" Jan 10 16:46:30 crc kubenswrapper[5036]: I0110 16:46:30.423301 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:46:31 crc kubenswrapper[5036]: I0110 16:46:31.136750 5036 generic.go:334] "Generic (PLEG): container finished" podID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerID="1ad2febe5c8f68ea344c14f9ea156485f20aa6081caee872dffcba4d0c8316b2" exitCode=0 Jan 10 16:46:31 crc kubenswrapper[5036]: I0110 16:46:31.136948 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" event={"ID":"d33c7366-b7b9-41d6-89ca-c71bc7561466","Type":"ContainerDied","Data":"1ad2febe5c8f68ea344c14f9ea156485f20aa6081caee872dffcba4d0c8316b2"} Jan 10 16:46:31 crc kubenswrapper[5036]: I0110 16:46:31.137801 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" event={"ID":"d33c7366-b7b9-41d6-89ca-c71bc7561466","Type":"ContainerStarted","Data":"15bedd485bae645f6f7650fe6c0daf0c12cbbbb099ed29a21042e96c72d70056"} Jan 10 16:46:31 crc kubenswrapper[5036]: I0110 16:46:31.895270 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.115054 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.115319 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-central-agent" containerID="cri-o://58d0d3001d503c3b44d42727773790d14733ce6cc7cc20b1773f78ec18742634" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.115426 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-notification-agent" containerID="cri-o://63b13ef439ca6c62bf2244c701b81a82dfaf0673f9e6fb779cab6e45955aefc5" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.115428 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="proxy-httpd" containerID="cri-o://5a0380421d3f6630d8ecd24205055565cedff466560216642627b3358082472e" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.115633 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="sg-core" containerID="cri-o://ab6be45333fa9739aaae8dace5f8632f42666c9399daa665f72448f2ddde31de" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.162922 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" event={"ID":"d33c7366-b7b9-41d6-89ca-c71bc7561466","Type":"ContainerStarted","Data":"cd9e7a9f8b808facfb30e722559122ab795020d0b0d139cea91114c4c688d58c"} Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.163096 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-log" containerID="cri-o://ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.163523 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-api" containerID="cri-o://4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6" gracePeriod=30 Jan 10 16:46:32 crc kubenswrapper[5036]: I0110 16:46:32.188734 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" podStartSLOduration=3.18871994 podStartE2EDuration="3.18871994s" podCreationTimestamp="2026-01-10 16:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:32.185483518 +0000 UTC m=+1114.055719012" watchObservedRunningTime="2026-01-10 16:46:32.18871994 +0000 UTC m=+1114.058955434" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.174964 5036 generic.go:334] "Generic (PLEG): container finished" podID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerID="5a0380421d3f6630d8ecd24205055565cedff466560216642627b3358082472e" exitCode=0 Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175312 5036 generic.go:334] "Generic (PLEG): container finished" podID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerID="ab6be45333fa9739aaae8dace5f8632f42666c9399daa665f72448f2ddde31de" exitCode=2 Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175030 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerDied","Data":"5a0380421d3f6630d8ecd24205055565cedff466560216642627b3358082472e"} Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175369 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerDied","Data":"ab6be45333fa9739aaae8dace5f8632f42666c9399daa665f72448f2ddde31de"} Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175384 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerDied","Data":"63b13ef439ca6c62bf2244c701b81a82dfaf0673f9e6fb779cab6e45955aefc5"} Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175329 5036 generic.go:334] "Generic (PLEG): container finished" podID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerID="63b13ef439ca6c62bf2244c701b81a82dfaf0673f9e6fb779cab6e45955aefc5" exitCode=0 Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175411 5036 generic.go:334] "Generic (PLEG): container finished" 
podID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerID="58d0d3001d503c3b44d42727773790d14733ce6cc7cc20b1773f78ec18742634" exitCode=0 Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.175468 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerDied","Data":"58d0d3001d503c3b44d42727773790d14733ce6cc7cc20b1773f78ec18742634"} Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.177399 5036 generic.go:334] "Generic (PLEG): container finished" podID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerID="ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02" exitCode=143 Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.177482 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerDied","Data":"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02"} Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.177628 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.180369 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314257 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314328 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314407 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314475 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314506 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqd4\" (UniqueName: \"kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314600 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314633 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.314765 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data\") pod \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\" (UID: \"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c\") " Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.315263 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.315480 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.315793 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.315809 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.324250 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts" (OuterVolumeSpecName: "scripts") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.324284 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4" (OuterVolumeSpecName: "kube-api-access-jzqd4") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "kube-api-access-jzqd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.352152 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.409894 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.418170 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqd4\" (UniqueName: \"kubernetes.io/projected/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-kube-api-access-jzqd4\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.418209 5036 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.418225 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.418235 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.423730 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.425285 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data" (OuterVolumeSpecName: "config-data") pod "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" (UID: "f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.494799 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.520088 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:33 crc kubenswrapper[5036]: I0110 16:46:33.520131 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.186588 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.186574 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c","Type":"ContainerDied","Data":"89e73c32069fbf5ce31ac5f27c17d9215b6db3e38a0c4c4eb2d821269249d566"} Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.187162 5036 scope.go:117] "RemoveContainer" containerID="5a0380421d3f6630d8ecd24205055565cedff466560216642627b3358082472e" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.209213 5036 scope.go:117] "RemoveContainer" containerID="ab6be45333fa9739aaae8dace5f8632f42666c9399daa665f72448f2ddde31de" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.236860 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.243875 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.251851 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:34 crc kubenswrapper[5036]: E0110 16:46:34.252217 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="sg-core" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252233 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="sg-core" Jan 10 16:46:34 crc kubenswrapper[5036]: E0110 16:46:34.252245 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-notification-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252254 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-notification-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: E0110 16:46:34.252263 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-central-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252269 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-central-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: E0110 16:46:34.252297 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="proxy-httpd" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252302 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="proxy-httpd" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252493 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="proxy-httpd" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252507 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="sg-core" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252518 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" containerName="ceilometer-notification-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.252532 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" 
containerName="ceilometer-central-agent" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.254870 5036 scope.go:117] "RemoveContainer" containerID="63b13ef439ca6c62bf2244c701b81a82dfaf0673f9e6fb779cab6e45955aefc5" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.279325 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.281619 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.282967 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.283089 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.283357 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.307672 5036 scope.go:117] "RemoveContainer" containerID="58d0d3001d503c3b44d42727773790d14733ce6cc7cc20b1773f78ec18742634" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.332917 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.332960 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.332983 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.333010 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.333063 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.333087 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.333107 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh7l\" (UniqueName: \"kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.333131 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.434450 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.434786 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.434929 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh7l\" (UniqueName: \"kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.435060 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.435190 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.435298 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.435391 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.435521 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc 
kubenswrapper[5036]: I0110 16:46:34.435831 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.436117 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.441252 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.443069 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.443887 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.445220 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.453122 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.460035 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh7l\" (UniqueName: \"kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l\") pod \"ceilometer-0\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " pod="openstack/ceilometer-0" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.520031 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c" path="/var/lib/kubelet/pods/f25a6fbd-9d05-4ccd-bcf0-0e2569e5210c/volumes" Jan 10 16:46:34 crc kubenswrapper[5036]: I0110 16:46:34.608981 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.107119 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.199071 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerStarted","Data":"116920f4f6ce959c10e8564ac2cbd93c94f828e95233e6c67247a14ec0a54b3a"} Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.737494 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.856905 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data\") pod \"9682c9a0-dc68-464d-bba2-04049b4e2b36\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.857061 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle\") pod \"9682c9a0-dc68-464d-bba2-04049b4e2b36\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.857123 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs\") pod \"9682c9a0-dc68-464d-bba2-04049b4e2b36\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.857160 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpmzw\" (UniqueName: \"kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw\") pod \"9682c9a0-dc68-464d-bba2-04049b4e2b36\" (UID: \"9682c9a0-dc68-464d-bba2-04049b4e2b36\") " Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.858341 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs" (OuterVolumeSpecName: "logs") pod "9682c9a0-dc68-464d-bba2-04049b4e2b36" (UID: "9682c9a0-dc68-464d-bba2-04049b4e2b36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.863490 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw" (OuterVolumeSpecName: "kube-api-access-xpmzw") pod "9682c9a0-dc68-464d-bba2-04049b4e2b36" (UID: "9682c9a0-dc68-464d-bba2-04049b4e2b36"). InnerVolumeSpecName "kube-api-access-xpmzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.900278 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data" (OuterVolumeSpecName: "config-data") pod "9682c9a0-dc68-464d-bba2-04049b4e2b36" (UID: "9682c9a0-dc68-464d-bba2-04049b4e2b36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.904075 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9682c9a0-dc68-464d-bba2-04049b4e2b36" (UID: "9682c9a0-dc68-464d-bba2-04049b4e2b36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.959700 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpmzw\" (UniqueName: \"kubernetes.io/projected/9682c9a0-dc68-464d-bba2-04049b4e2b36-kube-api-access-xpmzw\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.959930 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.959990 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9682c9a0-dc68-464d-bba2-04049b4e2b36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:35 crc kubenswrapper[5036]: I0110 16:46:35.960045 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9682c9a0-dc68-464d-bba2-04049b4e2b36-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.209169 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerStarted","Data":"5489c0ae77ae17200383fd972a712757b379a5d38a6f74632ffa477d87e91865"} Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.211420 5036 generic.go:334] "Generic (PLEG): container finished" podID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerID="4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6" exitCode=0 Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.211476 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerDied","Data":"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6"} Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.211507 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9682c9a0-dc68-464d-bba2-04049b4e2b36","Type":"ContainerDied","Data":"fd0c0020c235d390c5ca73901182bab59d710a564768200dff18f6bfa6b2ae1d"} Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.211523 5036 scope.go:117] "RemoveContainer" containerID="4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.211719 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.234617 5036 scope.go:117] "RemoveContainer" containerID="ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.250486 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.255334 5036 scope.go:117] "RemoveContainer" containerID="4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6" Jan 10 16:46:36 crc kubenswrapper[5036]: E0110 16:46:36.256097 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6\": container with ID starting with 4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6 not found: ID does not exist" containerID="4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.256129 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6"} err="failed to get container status \"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6\": rpc error: code = NotFound desc = could not find container \"4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6\": container with ID starting with 4600232eb985e44761c5655631545f78f5ef776d1218d905e85ebe84886369b6 not found: ID does not exist" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.256152 5036 scope.go:117] "RemoveContainer" containerID="ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02" Jan 10 16:46:36 crc kubenswrapper[5036]: E0110 16:46:36.257252 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02\": container with ID starting with ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02 not found: ID does not exist" containerID="ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.257280 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02"} err="failed to get container status \"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02\": rpc error: code = NotFound desc = could not find container \"ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02\": container with ID starting with ac41b658d2114ae96d897eabc0748e6b610703b2b3252d2b32f4c5758bacbc02 not found: ID does not exist" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.262726 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.286860 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:36 crc kubenswrapper[5036]: E0110 16:46:36.287367 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-log" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.287389 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-log" Jan 10 16:46:36 crc 
kubenswrapper[5036]: E0110 16:46:36.287418 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-api" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.287429 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-api" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.287631 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-log" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.287653 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" containerName="nova-api-api" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.288827 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.292159 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.292340 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.292465 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.303084 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.467744 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.467817 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.467957 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.468048 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.468149 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrj5\" (UniqueName: \"kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.468311 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.518069 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9682c9a0-dc68-464d-bba2-04049b4e2b36" path="/var/lib/kubelet/pods/9682c9a0-dc68-464d-bba2-04049b4e2b36/volumes" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570005 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570062 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570097 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570144 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrj5\" (UniqueName: \"kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570196 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570294 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.570787 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.575525 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.582333 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.599323 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.599666 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.604271 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrj5\" (UniqueName: \"kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5\") pod \"nova-api-0\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " pod="openstack/nova-api-0" Jan 10 16:46:36 crc kubenswrapper[5036]: I0110 16:46:36.618309 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:37 crc kubenswrapper[5036]: I0110 16:46:37.138791 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:37 crc kubenswrapper[5036]: W0110 16:46:37.139890 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a385be_22ab_421b_bdd1_8b45b4aaae47.slice/crio-528dc536bdc8f3bb97878827101fcca42b9780f8fbd22efa9e0fe075953ca444 WatchSource:0}: Error finding container 528dc536bdc8f3bb97878827101fcca42b9780f8fbd22efa9e0fe075953ca444: Status 404 returned error can't find the container with id 528dc536bdc8f3bb97878827101fcca42b9780f8fbd22efa9e0fe075953ca444 Jan 10 16:46:37 crc kubenswrapper[5036]: I0110 16:46:37.222711 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerStarted","Data":"528dc536bdc8f3bb97878827101fcca42b9780f8fbd22efa9e0fe075953ca444"} Jan 10 16:46:37 crc kubenswrapper[5036]: I0110 16:46:37.225728 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerStarted","Data":"bdb6d471dc67978377d531b907db8ae43857a59da3fc030c32e9911c851c92df"} Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.236652 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerStarted","Data":"87f287e589de8e951628d2f6583bc39b1ce289181410c6c1d0d58d4cf4352c9c"} Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.239430 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerStarted","Data":"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5"} Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.239458 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerStarted","Data":"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b"} Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.267073 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.2670559519999998 podStartE2EDuration="2.267055952s" podCreationTimestamp="2026-01-10 16:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:38.262919075 +0000 UTC m=+1120.133154579" watchObservedRunningTime="2026-01-10 16:46:38.267055952 +0000 UTC m=+1120.137291446" Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.496663 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:38 crc kubenswrapper[5036]: I0110 16:46:38.530146 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.251191 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerStarted","Data":"2f69a058e801b8c19d6ee86a897a2e0c0a251d3c018051982d1e811a6a348011"} Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.286484 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.672389938 podStartE2EDuration="5.286464352s" podCreationTimestamp="2026-01-10 16:46:34 +0000 UTC" firstStartedPulling="2026-01-10 16:46:35.109253055 +0000 UTC m=+1116.979488559" lastFinishedPulling="2026-01-10 16:46:38.723327469 +0000 UTC m=+1120.593562973" observedRunningTime="2026-01-10 16:46:39.276000835 +0000 UTC m=+1121.146236349" watchObservedRunningTime="2026-01-10 16:46:39.286464352 +0000 UTC m=+1121.156699846" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.289742 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.446322 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ttt6f"] Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.447750 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.458957 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.459313 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.462197 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ttt6f"] Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.544650 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.544735 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phrn4\" (UniqueName: \"kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.544802 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.544838 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.648744 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.648821 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phrn4\" (UniqueName: \"kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.648896 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.648934 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.654311 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.654365 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.668054 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.669176 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phrn4\" (UniqueName: \"kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4\") pod \"nova-cell1-cell-mapping-ttt6f\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.776478 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:39 crc kubenswrapper[5036]: I0110 16:46:39.888736 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.021603 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.022145 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="dnsmasq-dns" containerID="cri-o://acb2bdc31b10187a0578b148acc28dcee1334599feeb326394dcd30113661bee" gracePeriod=10 Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.280405 5036 generic.go:334] "Generic (PLEG): container finished" podID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerID="acb2bdc31b10187a0578b148acc28dcee1334599feeb326394dcd30113661bee" exitCode=0 Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.281012 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" event={"ID":"cbbe955a-7e12-4ed0-a795-4f182840d5e2","Type":"ContainerDied","Data":"acb2bdc31b10187a0578b148acc28dcee1334599feeb326394dcd30113661bee"} Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.281524 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.310824 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ttt6f"] Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.541193 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.688745 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb\") pod \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.688885 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc\") pod \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.688963 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwffv\" (UniqueName: \"kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv\") pod \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.688993 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config\") pod \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.689119 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb\") pod \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\" (UID: \"cbbe955a-7e12-4ed0-a795-4f182840d5e2\") " Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.707818 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv" (OuterVolumeSpecName: "kube-api-access-kwffv") pod "cbbe955a-7e12-4ed0-a795-4f182840d5e2" (UID: "cbbe955a-7e12-4ed0-a795-4f182840d5e2"). InnerVolumeSpecName "kube-api-access-kwffv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.782447 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cbbe955a-7e12-4ed0-a795-4f182840d5e2" (UID: "cbbe955a-7e12-4ed0-a795-4f182840d5e2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.792658 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.792719 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwffv\" (UniqueName: \"kubernetes.io/projected/cbbe955a-7e12-4ed0-a795-4f182840d5e2-kube-api-access-kwffv\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.806180 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config" (OuterVolumeSpecName: "config") pod "cbbe955a-7e12-4ed0-a795-4f182840d5e2" (UID: "cbbe955a-7e12-4ed0-a795-4f182840d5e2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.814144 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbbe955a-7e12-4ed0-a795-4f182840d5e2" (UID: "cbbe955a-7e12-4ed0-a795-4f182840d5e2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.820154 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cbbe955a-7e12-4ed0-a795-4f182840d5e2" (UID: "cbbe955a-7e12-4ed0-a795-4f182840d5e2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.896085 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.896140 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:40 crc kubenswrapper[5036]: I0110 16:46:40.896155 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbbe955a-7e12-4ed0-a795-4f182840d5e2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.302411 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ttt6f" event={"ID":"62ba54f7-a8b0-4836-b983-63702bb4c94d","Type":"ContainerStarted","Data":"8691cb49074c75a92edd84be9cdc3691890a2b9ce024486d692d4a9988501753"} Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.302463 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ttt6f" event={"ID":"62ba54f7-a8b0-4836-b983-63702bb4c94d","Type":"ContainerStarted","Data":"a598b03f0b0ff4343c026934c4fda939e169227dff9da34726a01cf9cdd1bb74"} Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.316999 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ttt6f" podStartSLOduration=2.316981824 podStartE2EDuration="2.316981824s" podCreationTimestamp="2026-01-10 16:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:41.315940075 +0000 UTC m=+1123.186175569" watchObservedRunningTime="2026-01-10 16:46:41.316981824 +0000 UTC m=+1123.187217318" Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.320361 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.320375 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-q9t6n" event={"ID":"cbbe955a-7e12-4ed0-a795-4f182840d5e2","Type":"ContainerDied","Data":"ec414d17de08af97d1a50e640090c5c04f2bff5800e2194d4e480087d7a953bb"} Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.320453 5036 scope.go:117] "RemoveContainer" containerID="acb2bdc31b10187a0578b148acc28dcee1334599feeb326394dcd30113661bee" Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.343482 5036 scope.go:117] "RemoveContainer" containerID="99e6a6cd944efa6b4ca679b555841f1f50d263bf17e301f621bd91bd7bb7918d" Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.355030 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:46:41 crc kubenswrapper[5036]: I0110 16:46:41.362466 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-q9t6n"] Jan 10 16:46:42 crc kubenswrapper[5036]: I0110 16:46:42.517835 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" path="/var/lib/kubelet/pods/cbbe955a-7e12-4ed0-a795-4f182840d5e2/volumes" Jan 10 16:46:45 crc kubenswrapper[5036]: I0110 16:46:45.369625 5036 generic.go:334] "Generic (PLEG): container finished" podID="62ba54f7-a8b0-4836-b983-63702bb4c94d" containerID="8691cb49074c75a92edd84be9cdc3691890a2b9ce024486d692d4a9988501753" exitCode=0 Jan 10 16:46:45 crc kubenswrapper[5036]: I0110 16:46:45.369698 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ttt6f" event={"ID":"62ba54f7-a8b0-4836-b983-63702bb4c94d","Type":"ContainerDied","Data":"8691cb49074c75a92edd84be9cdc3691890a2b9ce024486d692d4a9988501753"} Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.619760 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.619826 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.791070 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.910263 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts\") pod \"62ba54f7-a8b0-4836-b983-63702bb4c94d\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.910571 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phrn4\" (UniqueName: \"kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4\") pod \"62ba54f7-a8b0-4836-b983-63702bb4c94d\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.910616 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data\") pod \"62ba54f7-a8b0-4836-b983-63702bb4c94d\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.911149 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle\") pod \"62ba54f7-a8b0-4836-b983-63702bb4c94d\" (UID: \"62ba54f7-a8b0-4836-b983-63702bb4c94d\") " Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.917308 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts" (OuterVolumeSpecName: "scripts") pod "62ba54f7-a8b0-4836-b983-63702bb4c94d" (UID: "62ba54f7-a8b0-4836-b983-63702bb4c94d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.928815 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4" (OuterVolumeSpecName: "kube-api-access-phrn4") pod "62ba54f7-a8b0-4836-b983-63702bb4c94d" (UID: "62ba54f7-a8b0-4836-b983-63702bb4c94d"). InnerVolumeSpecName "kube-api-access-phrn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.946140 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62ba54f7-a8b0-4836-b983-63702bb4c94d" (UID: "62ba54f7-a8b0-4836-b983-63702bb4c94d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:46 crc kubenswrapper[5036]: I0110 16:46:46.963854 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data" (OuterVolumeSpecName: "config-data") pod "62ba54f7-a8b0-4836-b983-63702bb4c94d" (UID: "62ba54f7-a8b0-4836-b983-63702bb4c94d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.013799 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phrn4\" (UniqueName: \"kubernetes.io/projected/62ba54f7-a8b0-4836-b983-63702bb4c94d-kube-api-access-phrn4\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.013853 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.013871 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.013893 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62ba54f7-a8b0-4836-b983-63702bb4c94d-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.411374 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ttt6f" event={"ID":"62ba54f7-a8b0-4836-b983-63702bb4c94d","Type":"ContainerDied","Data":"a598b03f0b0ff4343c026934c4fda939e169227dff9da34726a01cf9cdd1bb74"} Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.411421 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a598b03f0b0ff4343c026934c4fda939e169227dff9da34726a01cf9cdd1bb74" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.411529 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ttt6f" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.585415 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.585748 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-api" containerID="cri-o://749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5" gracePeriod=30 Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.589467 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-log" containerID="cri-o://ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b" gracePeriod=30 Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.595376 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": EOF" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.595377 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": EOF" Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.625575 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.625800 5036 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerName="nova-scheduler-scheduler" containerID="cri-o://2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" gracePeriod=30 Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.634773 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.635253 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" containerID="cri-o://882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586" gracePeriod=30 Jan 10 16:46:47 crc kubenswrapper[5036]: I0110 16:46:47.635307 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" containerID="cri-o://7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac" gracePeriod=30 Jan 10 16:46:48 crc kubenswrapper[5036]: I0110 16:46:48.420454 5036 generic.go:334] "Generic (PLEG): container finished" podID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerID="882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586" exitCode=143 Jan 10 16:46:48 crc kubenswrapper[5036]: I0110 16:46:48.420526 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerDied","Data":"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586"} Jan 10 16:46:48 crc kubenswrapper[5036]: I0110 16:46:48.422279 5036 generic.go:334] "Generic (PLEG): container finished" podID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerID="ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b" exitCode=143 Jan 10 16:46:48 crc kubenswrapper[5036]: I0110 16:46:48.422330 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerDied","Data":"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b"} Jan 10 16:46:49 crc kubenswrapper[5036]: E0110 16:46:49.323822 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:49 crc kubenswrapper[5036]: E0110 16:46:49.326033 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:49 crc kubenswrapper[5036]: E0110 16:46:49.332249 5036 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 10 16:46:49 crc kubenswrapper[5036]: E0110 16:46:49.332312 5036 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerName="nova-scheduler-scheduler" Jan 10 16:46:50 crc kubenswrapper[5036]: I0110 16:46:50.777634 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:53566->10.217.0.178:8775: read: connection reset by peer" Jan 10 16:46:50 crc kubenswrapper[5036]: I0110 16:46:50.777650 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:53552->10.217.0.178:8775: read: connection reset by peer" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.214813 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.304090 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data\") pod \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.304163 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs\") pod \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.304197 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs\") pod \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.304230 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle\") pod \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.304262 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsqdm\" (UniqueName: \"kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm\") pod \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\" (UID: \"8ebf6309-92e5-4223-93b6-93138eb0c7e5\") " Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.305268 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs" (OuterVolumeSpecName: "logs") pod "8ebf6309-92e5-4223-93b6-93138eb0c7e5" (UID: "8ebf6309-92e5-4223-93b6-93138eb0c7e5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.313938 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm" (OuterVolumeSpecName: "kube-api-access-tsqdm") pod "8ebf6309-92e5-4223-93b6-93138eb0c7e5" (UID: "8ebf6309-92e5-4223-93b6-93138eb0c7e5"). InnerVolumeSpecName "kube-api-access-tsqdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.341219 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ebf6309-92e5-4223-93b6-93138eb0c7e5" (UID: "8ebf6309-92e5-4223-93b6-93138eb0c7e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.366921 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data" (OuterVolumeSpecName: "config-data") pod "8ebf6309-92e5-4223-93b6-93138eb0c7e5" (UID: "8ebf6309-92e5-4223-93b6-93138eb0c7e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.398397 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8ebf6309-92e5-4223-93b6-93138eb0c7e5" (UID: "8ebf6309-92e5-4223-93b6-93138eb0c7e5"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.405946 5036 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.405976 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ebf6309-92e5-4223-93b6-93138eb0c7e5-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.405986 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.405997 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsqdm\" (UniqueName: \"kubernetes.io/projected/8ebf6309-92e5-4223-93b6-93138eb0c7e5-kube-api-access-tsqdm\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.406007 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ebf6309-92e5-4223-93b6-93138eb0c7e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.451568 5036 generic.go:334] "Generic (PLEG): container finished" podID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerID="7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac" exitCode=0 Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.451607 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.451618 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerDied","Data":"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac"} Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.451651 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8ebf6309-92e5-4223-93b6-93138eb0c7e5","Type":"ContainerDied","Data":"8fe144268f39591c54dd996a66d3da78191a39ff8afa01528fe668be420bdfe9"} Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.451672 5036 scope.go:117] "RemoveContainer" containerID="7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.476586 5036 scope.go:117] "RemoveContainer" containerID="882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.493819 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.504455 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.505589 5036 scope.go:117] "RemoveContainer" containerID="7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.506486 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac\": container with ID starting 
with 7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac not found: ID does not exist" containerID="7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.506519 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac"} err="failed to get container status \"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac\": rpc error: code = NotFound desc = could not find container \"7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac\": container with ID starting with 7e183f3db963e65b3bc700aa9511a2abaa10b950452acb7bcb64b21b470218ac not found: ID does not exist" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.506539 5036 scope.go:117] "RemoveContainer" containerID="882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.510260 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586\": container with ID starting with 882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586 not found: ID does not exist" containerID="882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.510301 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586"} err="failed to get container status \"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586\": rpc error: code = NotFound desc = could not find container \"882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586\": container with ID starting with 882f394475a1388cdcb8effafc58c3fa2f30eca6637ce2c21e38ead5a0940586 not found: ID does not exist" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519030 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.519396 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="init" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519418 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="init" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.519429 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="dnsmasq-dns" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519439 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="dnsmasq-dns" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.519457 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519465 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.519487 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ba54f7-a8b0-4836-b983-63702bb4c94d" containerName="nova-manage" Jan 10 16:46:51 crc 
kubenswrapper[5036]: I0110 16:46:51.519495 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ba54f7-a8b0-4836-b983-63702bb4c94d" containerName="nova-manage" Jan 10 16:46:51 crc kubenswrapper[5036]: E0110 16:46:51.519514 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519522 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519738 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-metadata" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519755 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ba54f7-a8b0-4836-b983-63702bb4c94d" containerName="nova-manage" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519769 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" containerName="nova-metadata-log" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.519782 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbbe955a-7e12-4ed0-a795-4f182840d5e2" containerName="dnsmasq-dns" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.520709 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.523602 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.523961 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.557233 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.608908 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dade9a-7926-4c9b-82df-4c525efd69db-logs\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.608963 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.608980 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hl27\" (UniqueName: \"kubernetes.io/projected/f2dade9a-7926-4c9b-82df-4c525efd69db-kube-api-access-5hl27\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.609045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.609119 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-config-data\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710189 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710238 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hl27\" (UniqueName: \"kubernetes.io/projected/f2dade9a-7926-4c9b-82df-4c525efd69db-kube-api-access-5hl27\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710303 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710364 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-config-data\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710415 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dade9a-7926-4c9b-82df-4c525efd69db-logs\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.710884 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2dade9a-7926-4c9b-82df-4c525efd69db-logs\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.714159 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.714322 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.714773 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f2dade9a-7926-4c9b-82df-4c525efd69db-config-data\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.726914 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hl27\" (UniqueName: \"kubernetes.io/projected/f2dade9a-7926-4c9b-82df-4c525efd69db-kube-api-access-5hl27\") pod \"nova-metadata-0\" (UID: \"f2dade9a-7926-4c9b-82df-4c525efd69db\") " pod="openstack/nova-metadata-0" Jan 10 16:46:51 crc kubenswrapper[5036]: I0110 16:46:51.861953 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 10 16:46:52 crc kubenswrapper[5036]: I0110 16:46:52.366392 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 10 16:46:52 crc kubenswrapper[5036]: I0110 16:46:52.460114 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2dade9a-7926-4c9b-82df-4c525efd69db","Type":"ContainerStarted","Data":"fca01a0e664a1317cea919381a18f616acd50c83449cb6d73950a204df5ef0d5"} Jan 10 16:46:52 crc kubenswrapper[5036]: I0110 16:46:52.519161 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebf6309-92e5-4223-93b6-93138eb0c7e5" path="/var/lib/kubelet/pods/8ebf6309-92e5-4223-93b6-93138eb0c7e5/volumes" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.302885 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.375958 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.440809 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqbs7\" (UniqueName: \"kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7\") pod \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.441092 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data\") pod \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.441172 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle\") pod \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\" (UID: \"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.450550 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7" (OuterVolumeSpecName: "kube-api-access-wqbs7") pod "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" (UID: "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa"). InnerVolumeSpecName "kube-api-access-wqbs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.466437 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" (UID: "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.468792 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data" (OuterVolumeSpecName: "config-data") pod "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" (UID: "2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.471401 5036 generic.go:334] "Generic (PLEG): container finished" podID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerID="749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5" exitCode=0 Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.471459 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerDied","Data":"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.471487 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0a385be-22ab-421b-bdd1-8b45b4aaae47","Type":"ContainerDied","Data":"528dc536bdc8f3bb97878827101fcca42b9780f8fbd22efa9e0fe075953ca444"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.471504 5036 scope.go:117] "RemoveContainer" containerID="749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.471623 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.476610 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2dade9a-7926-4c9b-82df-4c525efd69db","Type":"ContainerStarted","Data":"f867f7053af0e6e1955136b6ac384f3170bf26fb39be2156a600ffd58e507e70"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.476662 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f2dade9a-7926-4c9b-82df-4c525efd69db","Type":"ContainerStarted","Data":"bc5f0ec14c27b6fa4901d849609a3545fc2fbb763f1f662ce9bc8ebd1e65ea2e"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.478837 5036 generic.go:334] "Generic (PLEG): container finished" podID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" exitCode=0 Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.478891 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.478897 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa","Type":"ContainerDied","Data":"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.478953 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa","Type":"ContainerDied","Data":"dad126616a282e50d8bf0fe27890e274e2dddbccfd6997eae31d78fe1b4e8072"} Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.495619 5036 scope.go:117] "RemoveContainer" containerID="ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.501516 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.501496022 podStartE2EDuration="2.501496022s" podCreationTimestamp="2026-01-10 16:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:53.495241914 +0000 UTC m=+1135.365477428" watchObservedRunningTime="2026-01-10 16:46:53.501496022 +0000 UTC m=+1135.371731516" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.525144 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.531715 5036 scope.go:117] "RemoveContainer" containerID="749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5" Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.532176 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5\": container with ID starting with 749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5 not found: ID does not exist" containerID="749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.532223 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5"} err="failed to get container status \"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5\": rpc error: code = NotFound desc = could not find container \"749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5\": container with ID starting with 749036d8bf1b77baee04cef17ccfa536e0d95170a2df1ac4c12fbd85141f1bc5 not found: ID does not exist" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.532249 5036 scope.go:117] "RemoveContainer" containerID="ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b" Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.532723 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b\": container with ID starting with ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b not found: ID does not exist" containerID="ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.532788 5036 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b"} err="failed to get container status \"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b\": rpc error: code = NotFound desc = could not find container \"ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b\": container with ID starting with ba52c8e9f90b4494befcf61873a3c2b416685b5305ff3da8030cde082ef32a8b not found: ID does not exist" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.532811 5036 scope.go:117] "RemoveContainer" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.534546 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544203 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544321 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544390 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544519 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544556 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.544796 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmrj5\" (UniqueName: \"kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5\") pod \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\" (UID: \"e0a385be-22ab-421b-bdd1-8b45b4aaae47\") " Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.545396 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.545425 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqbs7\" (UniqueName: \"kubernetes.io/projected/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-kube-api-access-wqbs7\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.545440 5036 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.546016 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs" (OuterVolumeSpecName: "logs") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.548403 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5" (OuterVolumeSpecName: "kube-api-access-pmrj5") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "kube-api-access-pmrj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.559518 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.560015 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-log" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560040 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-log" Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.560072 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerName="nova-scheduler-scheduler" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560081 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerName="nova-scheduler-scheduler" Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.560102 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-api" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560110 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-api" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560332 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" containerName="nova-scheduler-scheduler" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560356 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-log" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.560374 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" containerName="nova-api-api" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.561265 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.566005 5036 scope.go:117] "RemoveContainer" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.566139 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 10 16:46:53 crc kubenswrapper[5036]: E0110 16:46:53.566637 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec\": container with ID starting with 2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec not found: ID does not exist" containerID="2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.566669 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec"} err="failed to get container status \"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec\": rpc error: code = NotFound desc = could not find container \"2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec\": container with ID starting with 2419bc8ef43251b9a30022bd39e79403103ca47db5e1838f25d20ebc5af7b0ec not found: ID does not exist" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.572620 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.574555 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.582735 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data" (OuterVolumeSpecName: "config-data") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.612420 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.614408 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0a385be-22ab-421b-bdd1-8b45b4aaae47" (UID: "e0a385be-22ab-421b-bdd1-8b45b4aaae47"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647010 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647049 5036 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647062 5036 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647074 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmrj5\" (UniqueName: \"kubernetes.io/projected/e0a385be-22ab-421b-bdd1-8b45b4aaae47-kube-api-access-pmrj5\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647085 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a385be-22ab-421b-bdd1-8b45b4aaae47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.647099 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0a385be-22ab-421b-bdd1-8b45b4aaae47-logs\") on node \"crc\" DevicePath \"\"" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.748731 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfl2l\" (UniqueName: \"kubernetes.io/projected/50d7fbd5-136f-4138-b4de-7d0841e80688-kube-api-access-rfl2l\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.748857 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.748954 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-config-data\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.815713 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.824908 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.835910 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.838063 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.840802 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.841047 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.841201 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.849986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfl2l\" (UniqueName: \"kubernetes.io/projected/50d7fbd5-136f-4138-b4de-7d0841e80688-kube-api-access-rfl2l\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.850087 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.850208 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-config-data\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.854260 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-config-data\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.854525 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50d7fbd5-136f-4138-b4de-7d0841e80688-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.873669 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.877214 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfl2l\" (UniqueName: \"kubernetes.io/projected/50d7fbd5-136f-4138-b4de-7d0841e80688-kube-api-access-rfl2l\") pod \"nova-scheduler-0\" (UID: \"50d7fbd5-136f-4138-b4de-7d0841e80688\") " pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.892021 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.951803 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.951877 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-config-data\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.951956 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.952015 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf23453-9366-4458-9e7c-af60e7ef7b83-logs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.952034 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9mg\" (UniqueName: \"kubernetes.io/projected/3cf23453-9366-4458-9e7c-af60e7ef7b83-kube-api-access-vf9mg\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:53 crc kubenswrapper[5036]: I0110 16:46:53.952052 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.053903 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.054584 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-config-data\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.054695 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.054813 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3cf23453-9366-4458-9e7c-af60e7ef7b83-logs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.054845 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9mg\" (UniqueName: \"kubernetes.io/projected/3cf23453-9366-4458-9e7c-af60e7ef7b83-kube-api-access-vf9mg\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.054873 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.055130 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf23453-9366-4458-9e7c-af60e7ef7b83-logs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.058838 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.059340 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-config-data\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.060610 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-public-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.061282 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf23453-9366-4458-9e7c-af60e7ef7b83-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.073360 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9mg\" (UniqueName: \"kubernetes.io/projected/3cf23453-9366-4458-9e7c-af60e7ef7b83-kube-api-access-vf9mg\") pod \"nova-api-0\" (UID: \"3cf23453-9366-4458-9e7c-af60e7ef7b83\") " pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.307179 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 10 16:46:54 crc kubenswrapper[5036]: W0110 16:46:54.332503 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d7fbd5_136f_4138_b4de_7d0841e80688.slice/crio-b582dd84af880866f90ae0082071bf7ef7d80a5763d6fd7c2c83c234716d1fde WatchSource:0}: Error finding container b582dd84af880866f90ae0082071bf7ef7d80a5763d6fd7c2c83c234716d1fde: Status 404 returned error can't find the container with id b582dd84af880866f90ae0082071bf7ef7d80a5763d6fd7c2c83c234716d1fde Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.332806 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.488298 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50d7fbd5-136f-4138-b4de-7d0841e80688","Type":"ContainerStarted","Data":"b582dd84af880866f90ae0082071bf7ef7d80a5763d6fd7c2c83c234716d1fde"} Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.522023 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa" path="/var/lib/kubelet/pods/2a207ae0-2b70-4af3-b5df-82d5a9f8c3fa/volumes" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.522767 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a385be-22ab-421b-bdd1-8b45b4aaae47" path="/var/lib/kubelet/pods/e0a385be-22ab-421b-bdd1-8b45b4aaae47/volumes" Jan 10 16:46:54 crc kubenswrapper[5036]: I0110 16:46:54.742863 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 10 16:46:54 crc kubenswrapper[5036]: W0110 16:46:54.747550 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf23453_9366_4458_9e7c_af60e7ef7b83.slice/crio-b500167bcb3f108328f8e1c349083f73d1e6e3a0eab3a1bfef61cd967c62171f WatchSource:0}: Error finding container b500167bcb3f108328f8e1c349083f73d1e6e3a0eab3a1bfef61cd967c62171f: Status 404 returned error can't find the container with id b500167bcb3f108328f8e1c349083f73d1e6e3a0eab3a1bfef61cd967c62171f Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.498776 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cf23453-9366-4458-9e7c-af60e7ef7b83","Type":"ContainerStarted","Data":"6b8bf3aac5f5ae316cb7b79a612699964efb61c927495b1097747da1f92dc65d"} Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.499120 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cf23453-9366-4458-9e7c-af60e7ef7b83","Type":"ContainerStarted","Data":"e231c48896afb6d517903c5678da1976c1c7aedc28772f289a043697ea3ff47a"} Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.499134 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3cf23453-9366-4458-9e7c-af60e7ef7b83","Type":"ContainerStarted","Data":"b500167bcb3f108328f8e1c349083f73d1e6e3a0eab3a1bfef61cd967c62171f"} Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.500453 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50d7fbd5-136f-4138-b4de-7d0841e80688","Type":"ContainerStarted","Data":"5264203d3b8f01045bb413f865e0b488368439ae5d4debd7178f6c44d66531d1"} Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.522161 5036 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.522131993 podStartE2EDuration="2.522131993s" podCreationTimestamp="2026-01-10 16:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:55.520698333 +0000 UTC m=+1137.390933837" watchObservedRunningTime="2026-01-10 16:46:55.522131993 +0000 UTC m=+1137.392367527" Jan 10 16:46:55 crc kubenswrapper[5036]: I0110 16:46:55.544095 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.544078497 podStartE2EDuration="2.544078497s" podCreationTimestamp="2026-01-10 16:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:46:55.535996597 +0000 UTC m=+1137.406232111" watchObservedRunningTime="2026-01-10 16:46:55.544078497 +0000 UTC m=+1137.414313991" Jan 10 16:46:56 crc kubenswrapper[5036]: I0110 16:46:56.862935 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:56 crc kubenswrapper[5036]: I0110 16:46:56.863259 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 10 16:46:58 crc kubenswrapper[5036]: I0110 16:46:58.893312 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 10 16:47:01 crc kubenswrapper[5036]: I0110 16:47:01.862603 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 10 16:47:01 crc kubenswrapper[5036]: I0110 16:47:01.863105 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 10 16:47:02 crc kubenswrapper[5036]: I0110 16:47:02.884105 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f2dade9a-7926-4c9b-82df-4c525efd69db" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:47:02 crc kubenswrapper[5036]: I0110 16:47:02.884100 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f2dade9a-7926-4c9b-82df-4c525efd69db" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:47:03 crc kubenswrapper[5036]: I0110 16:47:03.892350 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 10 16:47:03 crc kubenswrapper[5036]: I0110 16:47:03.924784 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 10 16:47:04 crc kubenswrapper[5036]: I0110 16:47:04.308878 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:47:04 crc kubenswrapper[5036]: I0110 16:47:04.308960 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 10 16:47:04 crc kubenswrapper[5036]: I0110 16:47:04.619822 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 10 16:47:04 crc kubenswrapper[5036]: I0110 16:47:04.634848 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Jan 10 16:47:05 crc kubenswrapper[5036]: I0110 16:47:05.338907 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3cf23453-9366-4458-9e7c-af60e7ef7b83" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.189:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:47:05 crc kubenswrapper[5036]: I0110 16:47:05.338974 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3cf23453-9366-4458-9e7c-af60e7ef7b83" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.189:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 10 16:47:11 crc kubenswrapper[5036]: I0110 16:47:11.869103 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 10 16:47:11 crc kubenswrapper[5036]: I0110 16:47:11.870413 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 10 16:47:11 crc kubenswrapper[5036]: I0110 16:47:11.879221 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 10 16:47:11 crc kubenswrapper[5036]: I0110 16:47:11.881361 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.317785 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.318239 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.318458 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.318509 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.326468 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 10 16:47:14 crc kubenswrapper[5036]: I0110 16:47:14.329623 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 10 16:47:22 crc kubenswrapper[5036]: I0110 16:47:22.635977 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:24 crc kubenswrapper[5036]: I0110 16:47:24.130443 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:27 crc kubenswrapper[5036]: I0110 16:47:27.024675 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="rabbitmq" containerID="cri-o://e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299" gracePeriod=604796 Jan 10 16:47:28 crc kubenswrapper[5036]: I0110 16:47:28.873568 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="rabbitmq" containerID="cri-o://f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7" gracePeriod=604796 Jan 10 16:47:32 crc kubenswrapper[5036]: I0110 16:47:32.774403 5036 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-server-0" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.202856 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.736834 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.854717 5036 generic.go:334] "Generic (PLEG): container finished" podID="cd708bfb-a557-401f-b815-16d584c8eb78" containerID="e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299" exitCode=0 Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.854769 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerDied","Data":"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299"} Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.854796 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cd708bfb-a557-401f-b815-16d584c8eb78","Type":"ContainerDied","Data":"7d08dbf4df089f307d08c2f62ea15f05e60efd7b821c66f6c644139bf604a74c"} Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.854793 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.854814 5036 scope.go:117] "RemoveContainer" containerID="e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.886270 5036 scope.go:117] "RemoveContainer" containerID="40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.898550 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.898595 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.898658 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.898836 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.899364 5036 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.899633 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnwrx\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.899638 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.899894 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.900047 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.900073 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.900233 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.900335 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd\") pod \"cd708bfb-a557-401f-b815-16d584c8eb78\" (UID: \"cd708bfb-a557-401f-b815-16d584c8eb78\") " Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.900344 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.901137 5036 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.902285 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.902998 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.905902 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.907344 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info" (OuterVolumeSpecName: "pod-info") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.907867 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.908498 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx" (OuterVolumeSpecName: "kube-api-access-rnwrx") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "kube-api-access-rnwrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.908591 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.914470 5036 scope.go:117] "RemoveContainer" containerID="e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299" Jan 10 16:47:33 crc kubenswrapper[5036]: E0110 16:47:33.915139 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299\": container with ID starting with e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299 not found: ID does not exist" containerID="e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.915172 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299"} err="failed to get container status \"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299\": rpc error: code = NotFound desc = could not find container \"e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299\": container with ID starting with e4b148b346076aa377d8d88ff1b23fc00085057346777410276419385187b299 not found: ID does not exist" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.915197 5036 scope.go:117] "RemoveContainer" containerID="40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4" Jan 10 16:47:33 crc kubenswrapper[5036]: E0110 16:47:33.915475 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4\": container with ID starting with 40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4 not found: ID does not exist" containerID="40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.915493 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4"} err="failed to get container status \"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4\": rpc error: code = NotFound desc = could not find container \"40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4\": container with ID starting with 40a904c4742ed367fe558a46a911f8146836480ee5820dd2aeb7d14fee4a18f4 not found: ID does not exist" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.927696 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data" (OuterVolumeSpecName: "config-data") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:33 crc kubenswrapper[5036]: I0110 16:47:33.959918 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf" (OuterVolumeSpecName: "server-conf") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004472 5036 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004503 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004513 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnwrx\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-kube-api-access-rnwrx\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004523 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004532 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004542 5036 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cd708bfb-a557-401f-b815-16d584c8eb78-pod-info\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004550 5036 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cd708bfb-a557-401f-b815-16d584c8eb78-server-conf\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.004578 5036 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cd708bfb-a557-401f-b815-16d584c8eb78-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.005074 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cd708bfb-a557-401f-b815-16d584c8eb78" (UID: "cd708bfb-a557-401f-b815-16d584c8eb78"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.026244 5036 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.106652 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cd708bfb-a557-401f-b815-16d584c8eb78-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.106912 5036 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.198081 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.208868 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.216523 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:34 crc kubenswrapper[5036]: E0110 16:47:34.216954 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="rabbitmq" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.216978 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="rabbitmq" Jan 10 16:47:34 crc kubenswrapper[5036]: E0110 16:47:34.217022 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="setup-container" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.217030 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="setup-container" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.217214 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" containerName="rabbitmq" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.218537 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221183 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221219 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-b7v4x" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221292 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221304 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221348 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221365 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.221413 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.235931 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.412765 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413574 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413642 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413662 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e33d0131-d1d9-42cb-b772-7fe9835cee44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413703 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413725 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dl7x\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-kube-api-access-9dl7x\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413764 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e33d0131-d1d9-42cb-b772-7fe9835cee44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413816 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-config-data\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413842 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.413881 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.514792 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515054 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515186 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515316 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " 
pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515414 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e33d0131-d1d9-42cb-b772-7fe9835cee44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515539 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dl7x\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-kube-api-access-9dl7x\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515652 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e33d0131-d1d9-42cb-b772-7fe9835cee44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515795 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-config-data\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515249 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.515965 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.517037 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.518889 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.518908 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.519431 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.519605 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd708bfb-a557-401f-b815-16d584c8eb78" path="/var/lib/kubelet/pods/cd708bfb-a557-401f-b815-16d584c8eb78/volumes" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.519946 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.520183 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.520505 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e33d0131-d1d9-42cb-b772-7fe9835cee44-config-data\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.524523 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.525613 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.527411 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e33d0131-d1d9-42cb-b772-7fe9835cee44-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.531420 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e33d0131-d1d9-42cb-b772-7fe9835cee44-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.535369 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dl7x\" (UniqueName: \"kubernetes.io/projected/e33d0131-d1d9-42cb-b772-7fe9835cee44-kube-api-access-9dl7x\") pod \"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.555531 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"e33d0131-d1d9-42cb-b772-7fe9835cee44\") " pod="openstack/rabbitmq-server-0" Jan 10 16:47:34 crc kubenswrapper[5036]: I0110 16:47:34.844545 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.292196 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.507854 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646145 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646534 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646594 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646624 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646654 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646673 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mllbt\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646702 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646781 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646849 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646878 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.646917 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie\") pod \"8146d758-62d6-4640-86f8-51b89a8a8519\" (UID: \"8146d758-62d6-4640-86f8-51b89a8a8519\") " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.647897 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.647953 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.648118 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.652260 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info" (OuterVolumeSpecName: "pod-info") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.656322 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.656371 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.656450 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt" (OuterVolumeSpecName: "kube-api-access-mllbt") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "kube-api-access-mllbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.668818 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.690971 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data" (OuterVolumeSpecName: "config-data") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.722875 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf" (OuterVolumeSpecName: "server-conf") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748466 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748510 5036 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8146d758-62d6-4640-86f8-51b89a8a8519-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748525 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mllbt\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-kube-api-access-mllbt\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748559 5036 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748573 5036 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-server-conf\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748585 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748596 5036 reconciler_common.go:293] "Volume detached for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8146d758-62d6-4640-86f8-51b89a8a8519-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748607 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748618 5036 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8146d758-62d6-4640-86f8-51b89a8a8519-pod-info\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.748628 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.770918 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8146d758-62d6-4640-86f8-51b89a8a8519" (UID: "8146d758-62d6-4640-86f8-51b89a8a8519"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.773951 5036 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.850472 5036 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8146d758-62d6-4640-86f8-51b89a8a8519-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.850526 5036 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.878471 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e33d0131-d1d9-42cb-b772-7fe9835cee44","Type":"ContainerStarted","Data":"afac497e27b093ebee5cbf8a8d46aad03ac18a2ea232dda6fc1783243212483f"} Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.880294 5036 generic.go:334] "Generic (PLEG): container finished" podID="8146d758-62d6-4640-86f8-51b89a8a8519" containerID="f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7" exitCode=0 Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.880339 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerDied","Data":"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7"} Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.880401 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8146d758-62d6-4640-86f8-51b89a8a8519","Type":"ContainerDied","Data":"ec35042f1f0a13b49eeaf11f6a93f876702ecf50ec43890d2ac0ebdd3dc0d7c9"} Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.880397 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.880422 5036 scope.go:117] "RemoveContainer" containerID="f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.902827 5036 scope.go:117] "RemoveContainer" containerID="140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.924831 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.930408 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.933195 5036 scope.go:117] "RemoveContainer" containerID="f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7" Jan 10 16:47:35 crc kubenswrapper[5036]: E0110 16:47:35.934656 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7\": container with ID starting with f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7 not found: ID does not exist" containerID="f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.934762 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7"} err="failed to get container status \"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7\": rpc error: code = NotFound desc = could not find container \"f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7\": container with ID starting with f99399c64a79a6f43277fe41635e3723399ec561b2e34ae459cf787a81c219b7 not found: ID does not exist" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.934826 5036 scope.go:117] "RemoveContainer" containerID="140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e" Jan 10 16:47:35 crc kubenswrapper[5036]: E0110 16:47:35.935269 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e\": container with ID starting with 140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e not found: ID does not exist" containerID="140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.935305 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e"} err="failed to get container status \"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e\": rpc error: code = NotFound desc = could not find container \"140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e\": container with ID starting with 140d035c5adeb766202b21920371d68bd36600fed05e2465e654516163e8857e not found: ID does not exist" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.949124 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:35 crc kubenswrapper[5036]: E0110 16:47:35.950364 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" 
containerName="setup-container" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.950384 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="setup-container" Jan 10 16:47:35 crc kubenswrapper[5036]: E0110 16:47:35.950403 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="rabbitmq" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.950410 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="rabbitmq" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.950561 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" containerName="rabbitmq" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.952466 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.955784 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.956050 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.956195 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.956332 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.956466 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.956785 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.970584 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pdt5c" Jan 10 16:47:35 crc kubenswrapper[5036]: I0110 16:47:35.970790 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.155937 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156013 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156083 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156117 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156145 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxp5\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-kube-api-access-lzxp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156496 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156539 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156585 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156660 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.156706 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.157044 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258463 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258533 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258553 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258588 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258607 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258625 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxp5\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-kube-api-access-lzxp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258648 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258665 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258707 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258729 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258747 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.258758 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.259160 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.259481 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.260326 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.260426 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.261947 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.263096 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.263382 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.263720 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.271727 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.280772 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxp5\" (UniqueName: \"kubernetes.io/projected/debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1-kube-api-access-lzxp5\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.322788 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.543282 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8146d758-62d6-4640-86f8-51b89a8a8519" path="/var/lib/kubelet/pods/8146d758-62d6-4640-86f8-51b89a8a8519/volumes" Jan 10 16:47:36 crc kubenswrapper[5036]: I0110 16:47:36.576764 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:47:37 crc kubenswrapper[5036]: W0110 16:47:37.096887 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddebd7e7e_7e74_43b6_b3d1_70ae0ee20dd1.slice/crio-f87e90790f259f122fd8c7b5b3f3aaf2a017e805cf02fa834de1ee6992db8bc0 WatchSource:0}: Error finding container f87e90790f259f122fd8c7b5b3f3aaf2a017e805cf02fa834de1ee6992db8bc0: Status 404 returned error can't find the container with id f87e90790f259f122fd8c7b5b3f3aaf2a017e805cf02fa834de1ee6992db8bc0 Jan 10 16:47:37 crc kubenswrapper[5036]: I0110 16:47:37.101126 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 10 16:47:37 crc kubenswrapper[5036]: I0110 16:47:37.905350 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1","Type":"ContainerStarted","Data":"f87e90790f259f122fd8c7b5b3f3aaf2a017e805cf02fa834de1ee6992db8bc0"} Jan 10 16:47:37 crc kubenswrapper[5036]: I0110 16:47:37.910088 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e33d0131-d1d9-42cb-b772-7fe9835cee44","Type":"ContainerStarted","Data":"7d3ca660e62e51fb3cbf78bd2774de704ec2720789ade7f4b2c5ed496e976acf"} Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.411448 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.413919 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.416821 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.421436 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.535612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.535657 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcd8\" (UniqueName: \"kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.535710 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.535886 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.536048 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.536214 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.638111 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.638212 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcd8\" (UniqueName: \"kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" 
(UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.639001 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.639111 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.639160 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.639213 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.639239 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.640212 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.641103 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.641268 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.641826 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 
16:47:39.675009 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcd8\" (UniqueName: \"kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8\") pod \"dnsmasq-dns-6447ccbd8f-2j4xd\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.730151 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:39 crc kubenswrapper[5036]: I0110 16:47:39.931499 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1","Type":"ContainerStarted","Data":"af24954717f248cf9f183d73afddd0a2de799fb493f2da389096366d1e6fd728"} Jan 10 16:47:40 crc kubenswrapper[5036]: I0110 16:47:40.321438 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:47:40 crc kubenswrapper[5036]: W0110 16:47:40.342853 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce527b2a_a6be_4f43_86b1_2b1497a081ca.slice/crio-e7e7b5e035ab434891b42a3a8f247ee5901269c7ac0f690b3dfa7d01a18ef263 WatchSource:0}: Error finding container e7e7b5e035ab434891b42a3a8f247ee5901269c7ac0f690b3dfa7d01a18ef263: Status 404 returned error can't find the container with id e7e7b5e035ab434891b42a3a8f247ee5901269c7ac0f690b3dfa7d01a18ef263 Jan 10 16:47:40 crc kubenswrapper[5036]: I0110 16:47:40.939115 5036 generic.go:334] "Generic (PLEG): container finished" podID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerID="0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d" exitCode=0 Jan 10 16:47:40 crc kubenswrapper[5036]: I0110 16:47:40.939163 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" event={"ID":"ce527b2a-a6be-4f43-86b1-2b1497a081ca","Type":"ContainerDied","Data":"0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d"} Jan 10 16:47:40 crc kubenswrapper[5036]: I0110 16:47:40.939463 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" event={"ID":"ce527b2a-a6be-4f43-86b1-2b1497a081ca","Type":"ContainerStarted","Data":"e7e7b5e035ab434891b42a3a8f247ee5901269c7ac0f690b3dfa7d01a18ef263"} Jan 10 16:47:41 crc kubenswrapper[5036]: I0110 16:47:41.953363 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" event={"ID":"ce527b2a-a6be-4f43-86b1-2b1497a081ca","Type":"ContainerStarted","Data":"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5"} Jan 10 16:47:41 crc kubenswrapper[5036]: I0110 16:47:41.953652 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:42 crc kubenswrapper[5036]: I0110 16:47:41.982601 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" podStartSLOduration=2.982577114 podStartE2EDuration="2.982577114s" podCreationTimestamp="2026-01-10 16:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:47:41.976372698 +0000 UTC m=+1183.846608202" watchObservedRunningTime="2026-01-10 16:47:41.982577114 +0000 UTC m=+1183.852812618" Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.731903 
5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.815411 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.815711 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="dnsmasq-dns" containerID="cri-o://cd9e7a9f8b808facfb30e722559122ab795020d0b0d139cea91114c4c688d58c" gracePeriod=10 Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.886156 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.983734 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 16:47:49 crc kubenswrapper[5036]: I0110 16:47:49.985979 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.016718 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.033689 5036 generic.go:334] "Generic (PLEG): container finished" podID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerID="cd9e7a9f8b808facfb30e722559122ab795020d0b0d139cea91114c4c688d58c" exitCode=0 Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.033740 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" event={"ID":"d33c7366-b7b9-41d6-89ca-c71bc7561466","Type":"ContainerDied","Data":"cd9e7a9f8b808facfb30e722559122ab795020d0b0d139cea91114c4c688d58c"} Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.145613 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.145698 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.145736 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh527\" (UniqueName: \"kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.145806 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config\") pod 
\"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.145985 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.146054 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247609 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247719 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247759 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh527\" (UniqueName: \"kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247807 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247851 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.247879 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.248830 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: 
\"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.249181 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.249415 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.249615 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.249743 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.276118 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh527\" (UniqueName: \"kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527\") pod \"dnsmasq-dns-864d5fc68c-pg8hr\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.318913 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.328054 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.452211 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc\") pod \"d33c7366-b7b9-41d6-89ca-c71bc7561466\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.452433 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config\") pod \"d33c7366-b7b9-41d6-89ca-c71bc7561466\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.452548 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb\") pod \"d33c7366-b7b9-41d6-89ca-c71bc7561466\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.452596 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff89t\" (UniqueName: \"kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t\") pod \"d33c7366-b7b9-41d6-89ca-c71bc7561466\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.452621 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb\") pod \"d33c7366-b7b9-41d6-89ca-c71bc7561466\" (UID: \"d33c7366-b7b9-41d6-89ca-c71bc7561466\") " Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.486200 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t" (OuterVolumeSpecName: "kube-api-access-ff89t") pod "d33c7366-b7b9-41d6-89ca-c71bc7561466" (UID: "d33c7366-b7b9-41d6-89ca-c71bc7561466"). InnerVolumeSpecName "kube-api-access-ff89t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.554483 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff89t\" (UniqueName: \"kubernetes.io/projected/d33c7366-b7b9-41d6-89ca-c71bc7561466-kube-api-access-ff89t\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.564383 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d33c7366-b7b9-41d6-89ca-c71bc7561466" (UID: "d33c7366-b7b9-41d6-89ca-c71bc7561466"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.585599 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d33c7366-b7b9-41d6-89ca-c71bc7561466" (UID: "d33c7366-b7b9-41d6-89ca-c71bc7561466"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.643518 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d33c7366-b7b9-41d6-89ca-c71bc7561466" (UID: "d33c7366-b7b9-41d6-89ca-c71bc7561466"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.649975 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config" (OuterVolumeSpecName: "config") pod "d33c7366-b7b9-41d6-89ca-c71bc7561466" (UID: "d33c7366-b7b9-41d6-89ca-c71bc7561466"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.656238 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.656268 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.656281 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.656290 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d33c7366-b7b9-41d6-89ca-c71bc7561466-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:47:50 crc kubenswrapper[5036]: W0110 16:47:50.946490 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6ce8785_ce6a_4cf4_a5d8_b5f84da029d0.slice/crio-87561d71935330ec0435fa37ce00337a476051cde22a0b9c143bc9a8e880f532 WatchSource:0}: Error finding container 87561d71935330ec0435fa37ce00337a476051cde22a0b9c143bc9a8e880f532: Status 404 returned error can't find the container with id 87561d71935330ec0435fa37ce00337a476051cde22a0b9c143bc9a8e880f532 Jan 10 16:47:50 crc kubenswrapper[5036]: I0110 16:47:50.946747 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.048071 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.048068 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-8mt6q" event={"ID":"d33c7366-b7b9-41d6-89ca-c71bc7561466","Type":"ContainerDied","Data":"15bedd485bae645f6f7650fe6c0daf0c12cbbbb099ed29a21042e96c72d70056"} Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.048155 5036 scope.go:117] "RemoveContainer" containerID="cd9e7a9f8b808facfb30e722559122ab795020d0b0d139cea91114c4c688d58c" Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.050525 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" event={"ID":"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0","Type":"ContainerStarted","Data":"87561d71935330ec0435fa37ce00337a476051cde22a0b9c143bc9a8e880f532"} Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.097453 5036 scope.go:117] "RemoveContainer" containerID="1ad2febe5c8f68ea344c14f9ea156485f20aa6081caee872dffcba4d0c8316b2" Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.132135 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:47:51 crc kubenswrapper[5036]: I0110 16:47:51.140230 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-8mt6q"] Jan 10 16:47:52 crc kubenswrapper[5036]: I0110 16:47:52.066208 5036 generic.go:334] "Generic (PLEG): container finished" podID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerID="556c96ba7bb7dff6a497d182daf22a4d471d73a458921b83ae68931133573551" exitCode=0 Jan 10 16:47:52 crc kubenswrapper[5036]: I0110 16:47:52.066340 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" event={"ID":"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0","Type":"ContainerDied","Data":"556c96ba7bb7dff6a497d182daf22a4d471d73a458921b83ae68931133573551"} Jan 10 16:47:52 crc kubenswrapper[5036]: I0110 16:47:52.518160 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" path="/var/lib/kubelet/pods/d33c7366-b7b9-41d6-89ca-c71bc7561466/volumes" Jan 10 16:47:53 crc kubenswrapper[5036]: I0110 16:47:53.078114 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" event={"ID":"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0","Type":"ContainerStarted","Data":"8f744aa4622ad38aa33f50eeef3b061d0b282c6c7b5418edefd6b2e8902fa5a5"} Jan 10 16:47:53 crc kubenswrapper[5036]: I0110 16:47:53.078250 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:47:53 crc kubenswrapper[5036]: I0110 16:47:53.101646 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" podStartSLOduration=4.101627701 podStartE2EDuration="4.101627701s" podCreationTimestamp="2026-01-10 16:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:47:53.091466203 +0000 UTC m=+1194.961701697" watchObservedRunningTime="2026-01-10 16:47:53.101627701 +0000 UTC m=+1194.971863205" Jan 10 16:47:55 crc kubenswrapper[5036]: I0110 16:47:55.904051 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:47:55 crc kubenswrapper[5036]: I0110 16:47:55.904368 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:48:00 crc kubenswrapper[5036]: I0110 16:48:00.322110 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 16:48:00 crc kubenswrapper[5036]: I0110 16:48:00.399259 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:48:00 crc kubenswrapper[5036]: I0110 16:48:00.399508 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerName="dnsmasq-dns" containerID="cri-o://2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5" gracePeriod=10 Jan 10 16:48:00 crc kubenswrapper[5036]: I0110 16:48:00.893200 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059113 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059181 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059220 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059277 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059338 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmcd8\" (UniqueName: \"kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.059377 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc\") pod \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\" (UID: \"ce527b2a-a6be-4f43-86b1-2b1497a081ca\") " Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.064554 5036 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8" (OuterVolumeSpecName: "kube-api-access-tmcd8") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "kube-api-access-tmcd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.103928 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.104702 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.106977 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config" (OuterVolumeSpecName: "config") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.109619 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.110696 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ce527b2a-a6be-4f43-86b1-2b1497a081ca" (UID: "ce527b2a-a6be-4f43-86b1-2b1497a081ca"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.159891 5036 generic.go:334] "Generic (PLEG): container finished" podID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerID="2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5" exitCode=0 Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.159958 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" event={"ID":"ce527b2a-a6be-4f43-86b1-2b1497a081ca","Type":"ContainerDied","Data":"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5"} Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.159984 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" event={"ID":"ce527b2a-a6be-4f43-86b1-2b1497a081ca","Type":"ContainerDied","Data":"e7e7b5e035ab434891b42a3a8f247ee5901269c7ac0f690b3dfa7d01a18ef263"} Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.160053 5036 scope.go:117] "RemoveContainer" containerID="2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.160767 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2j4xd" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.160842 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.161060 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-config\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.161103 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.161113 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmcd8\" (UniqueName: \"kubernetes.io/projected/ce527b2a-a6be-4f43-86b1-2b1497a081ca-kube-api-access-tmcd8\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.161123 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.161132 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce527b2a-a6be-4f43-86b1-2b1497a081ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.200814 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.201866 5036 scope.go:117] "RemoveContainer" containerID="0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.204711 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2j4xd"] Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.219916 5036 scope.go:117] "RemoveContainer" 
containerID="2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5" Jan 10 16:48:01 crc kubenswrapper[5036]: E0110 16:48:01.220226 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5\": container with ID starting with 2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5 not found: ID does not exist" containerID="2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.220273 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5"} err="failed to get container status \"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5\": rpc error: code = NotFound desc = could not find container \"2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5\": container with ID starting with 2f04b1e13cffb35fbb645dc877dac6ab04de6d1b18c402f170f420c7ea5ecba5 not found: ID does not exist" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.220309 5036 scope.go:117] "RemoveContainer" containerID="0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d" Jan 10 16:48:01 crc kubenswrapper[5036]: E0110 16:48:01.220918 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d\": container with ID starting with 0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d not found: ID does not exist" containerID="0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d" Jan 10 16:48:01 crc kubenswrapper[5036]: I0110 16:48:01.220939 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d"} err="failed to get container status \"0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d\": rpc error: code = NotFound desc = could not find container \"0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d\": container with ID starting with 0f9509278a85000d1ecf2da71a83945973453f2d9c5051e4cbc525695fed1b1d not found: ID does not exist" Jan 10 16:48:02 crc kubenswrapper[5036]: I0110 16:48:02.520496 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" path="/var/lib/kubelet/pods/ce527b2a-a6be-4f43-86b1-2b1497a081ca/volumes" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.259694 5036 generic.go:334] "Generic (PLEG): container finished" podID="e33d0131-d1d9-42cb-b772-7fe9835cee44" containerID="7d3ca660e62e51fb3cbf78bd2774de704ec2720789ade7f4b2c5ed496e976acf" exitCode=0 Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.259826 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e33d0131-d1d9-42cb-b772-7fe9835cee44","Type":"ContainerDied","Data":"7d3ca660e62e51fb3cbf78bd2774de704ec2720789ade7f4b2c5ed496e976acf"} Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.500031 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6"] Jan 10 16:48:10 crc kubenswrapper[5036]: E0110 16:48:10.500994 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" 
containerName="init" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501013 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerName="init" Jan 10 16:48:10 crc kubenswrapper[5036]: E0110 16:48:10.501031 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501038 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: E0110 16:48:10.501056 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="init" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501063 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="init" Jan 10 16:48:10 crc kubenswrapper[5036]: E0110 16:48:10.501080 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501086 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501257 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d33c7366-b7b9-41d6-89ca-c71bc7561466" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501269 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce527b2a-a6be-4f43-86b1-2b1497a081ca" containerName="dnsmasq-dns" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.501882 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.504976 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.505408 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.505720 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.505977 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.528778 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6"] Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.549866 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.549956 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsbvx\" (UniqueName: \"kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.549993 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.550042 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.651128 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbvx\" (UniqueName: \"kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.651215 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.651812 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.651961 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.656124 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.656569 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.671805 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.673524 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsbvx\" (UniqueName: \"kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:10 crc kubenswrapper[5036]: I0110 16:48:10.830823 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.270520 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e33d0131-d1d9-42cb-b772-7fe9835cee44","Type":"ContainerStarted","Data":"65830a81fa65f023f0ad46ef2c158d2b3fb6979732e890c11615a70f4473f282"} Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.271069 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.273926 5036 generic.go:334] "Generic (PLEG): container finished" podID="debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1" containerID="af24954717f248cf9f183d73afddd0a2de799fb493f2da389096366d1e6fd728" exitCode=0 Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.273960 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1","Type":"ContainerDied","Data":"af24954717f248cf9f183d73afddd0a2de799fb493f2da389096366d1e6fd728"} Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.307197 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.307171734 podStartE2EDuration="37.307171734s" podCreationTimestamp="2026-01-10 16:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:48:11.296786659 +0000 UTC m=+1213.167022163" watchObservedRunningTime="2026-01-10 16:48:11.307171734 +0000 UTC m=+1213.177407248" Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.455099 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6"] Jan 10 16:48:11 crc kubenswrapper[5036]: I0110 16:48:11.466037 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:48:12 crc kubenswrapper[5036]: I0110 16:48:12.283820 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" event={"ID":"0ddc428b-2df4-4b8e-935e-cd07abb35a50","Type":"ContainerStarted","Data":"d8663136e985f5837efc7be045ff613067c62c2a49d1792163af2d4bf053d585"} Jan 10 16:48:12 crc kubenswrapper[5036]: I0110 16:48:12.286959 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1","Type":"ContainerStarted","Data":"2171c034d438913dcc9af2cfdb4b8a639f4d7e82d4aee50428a9fffd2cb62698"} Jan 10 16:48:12 crc kubenswrapper[5036]: I0110 16:48:12.288429 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:48:12 crc kubenswrapper[5036]: I0110 16:48:12.315526 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.315505387 podStartE2EDuration="37.315505387s" podCreationTimestamp="2026-01-10 16:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 16:48:12.310343881 +0000 UTC m=+1214.180579385" watchObservedRunningTime="2026-01-10 16:48:12.315505387 +0000 UTC m=+1214.185740891" Jan 10 16:48:21 crc kubenswrapper[5036]: I0110 16:48:21.389706 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" event={"ID":"0ddc428b-2df4-4b8e-935e-cd07abb35a50","Type":"ContainerStarted","Data":"cbead4e0ab9c072b5403183f12a0b05c59d2e5ce6af1916120449d5a32aff6cd"} Jan 10 16:48:21 crc kubenswrapper[5036]: I0110 16:48:21.410352 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" podStartSLOduration=2.492820373 podStartE2EDuration="11.410330122s" podCreationTimestamp="2026-01-10 16:48:10 +0000 UTC" firstStartedPulling="2026-01-10 16:48:11.463831443 +0000 UTC m=+1213.334066937" lastFinishedPulling="2026-01-10 16:48:20.381341192 +0000 UTC m=+1222.251576686" observedRunningTime="2026-01-10 16:48:21.404118505 +0000 UTC m=+1223.274354009" watchObservedRunningTime="2026-01-10 16:48:21.410330122 +0000 UTC m=+1223.280565626" Jan 10 16:48:24 crc kubenswrapper[5036]: I0110 16:48:24.850396 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 10 16:48:25 crc kubenswrapper[5036]: I0110 16:48:25.904361 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:48:25 crc kubenswrapper[5036]: I0110 16:48:25.904885 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:48:26 crc kubenswrapper[5036]: I0110 16:48:26.579941 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 10 16:48:32 crc kubenswrapper[5036]: I0110 16:48:32.491114 5036 generic.go:334] "Generic (PLEG): container finished" podID="0ddc428b-2df4-4b8e-935e-cd07abb35a50" containerID="cbead4e0ab9c072b5403183f12a0b05c59d2e5ce6af1916120449d5a32aff6cd" exitCode=0 Jan 10 16:48:32 crc kubenswrapper[5036]: I0110 16:48:32.491532 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" event={"ID":"0ddc428b-2df4-4b8e-935e-cd07abb35a50","Type":"ContainerDied","Data":"cbead4e0ab9c072b5403183f12a0b05c59d2e5ce6af1916120449d5a32aff6cd"} Jan 10 16:48:33 crc kubenswrapper[5036]: I0110 16:48:33.976069 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.047639 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam\") pod \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.047720 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory\") pod \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.047874 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsbvx\" (UniqueName: \"kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx\") pod \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.047906 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle\") pod \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\" (UID: \"0ddc428b-2df4-4b8e-935e-cd07abb35a50\") " Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.055935 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx" (OuterVolumeSpecName: "kube-api-access-lsbvx") pod "0ddc428b-2df4-4b8e-935e-cd07abb35a50" (UID: "0ddc428b-2df4-4b8e-935e-cd07abb35a50"). InnerVolumeSpecName "kube-api-access-lsbvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.059691 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0ddc428b-2df4-4b8e-935e-cd07abb35a50" (UID: "0ddc428b-2df4-4b8e-935e-cd07abb35a50"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.078119 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0ddc428b-2df4-4b8e-935e-cd07abb35a50" (UID: "0ddc428b-2df4-4b8e-935e-cd07abb35a50"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.078484 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory" (OuterVolumeSpecName: "inventory") pod "0ddc428b-2df4-4b8e-935e-cd07abb35a50" (UID: "0ddc428b-2df4-4b8e-935e-cd07abb35a50"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.149483 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.149513 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsbvx\" (UniqueName: \"kubernetes.io/projected/0ddc428b-2df4-4b8e-935e-cd07abb35a50-kube-api-access-lsbvx\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.149525 5036 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.149534 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0ddc428b-2df4-4b8e-935e-cd07abb35a50-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.515074 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.521737 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6" event={"ID":"0ddc428b-2df4-4b8e-935e-cd07abb35a50","Type":"ContainerDied","Data":"d8663136e985f5837efc7be045ff613067c62c2a49d1792163af2d4bf053d585"} Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.521765 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8663136e985f5837efc7be045ff613067c62c2a49d1792163af2d4bf053d585" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.606181 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg"] Jan 10 16:48:34 crc kubenswrapper[5036]: E0110 16:48:34.606672 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddc428b-2df4-4b8e-935e-cd07abb35a50" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.606723 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddc428b-2df4-4b8e-935e-cd07abb35a50" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.607012 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddc428b-2df4-4b8e-935e-cd07abb35a50" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.607910 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.610484 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.610882 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.611059 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.611331 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.634800 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg"] Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.761843 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.761932 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.761980 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.762010 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqbxm\" (UniqueName: \"kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.864324 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.864428 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqbxm\" (UniqueName: 
\"kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.864563 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.864723 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.869904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.871254 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.874485 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.884554 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqbxm\" (UniqueName: \"kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:34 crc kubenswrapper[5036]: I0110 16:48:34.929290 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:48:35 crc kubenswrapper[5036]: I0110 16:48:35.480270 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg"] Jan 10 16:48:35 crc kubenswrapper[5036]: W0110 16:48:35.481949 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d19244c_8236_481b_9b50_b6a641c7b724.slice/crio-1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935 WatchSource:0}: Error finding container 1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935: Status 404 returned error can't find the container with id 1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935 Jan 10 16:48:35 crc kubenswrapper[5036]: I0110 16:48:35.524370 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" event={"ID":"1d19244c-8236-481b-9b50-b6a641c7b724","Type":"ContainerStarted","Data":"1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935"} Jan 10 16:48:36 crc kubenswrapper[5036]: I0110 16:48:36.534484 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" event={"ID":"1d19244c-8236-481b-9b50-b6a641c7b724","Type":"ContainerStarted","Data":"4c8534a5bf9b3e37f93bd5ed1c6a0f741332efeb5607e9f1b6c120b09a26db01"} Jan 10 16:48:36 crc kubenswrapper[5036]: I0110 16:48:36.556385 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" podStartSLOduration=1.961630289 podStartE2EDuration="2.556323704s" podCreationTimestamp="2026-01-10 16:48:34 +0000 UTC" firstStartedPulling="2026-01-10 16:48:35.484946352 +0000 UTC m=+1237.355181846" lastFinishedPulling="2026-01-10 16:48:36.079639757 +0000 UTC m=+1237.949875261" observedRunningTime="2026-01-10 16:48:36.547011139 +0000 UTC m=+1238.417246673" watchObservedRunningTime="2026-01-10 16:48:36.556323704 +0000 UTC m=+1238.426559208" Jan 10 16:48:55 crc kubenswrapper[5036]: I0110 16:48:55.903911 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:48:55 crc kubenswrapper[5036]: I0110 16:48:55.904781 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:48:55 crc kubenswrapper[5036]: I0110 16:48:55.904849 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:48:55 crc kubenswrapper[5036]: I0110 16:48:55.905964 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:48:55 crc 
kubenswrapper[5036]: I0110 16:48:55.906065 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be" gracePeriod=600 Jan 10 16:48:56 crc kubenswrapper[5036]: I0110 16:48:56.728007 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be" exitCode=0 Jan 10 16:48:56 crc kubenswrapper[5036]: I0110 16:48:56.728083 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be"} Jan 10 16:48:56 crc kubenswrapper[5036]: I0110 16:48:56.728708 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785"} Jan 10 16:48:56 crc kubenswrapper[5036]: I0110 16:48:56.728733 5036 scope.go:117] "RemoveContainer" containerID="47b4506ff10880e72e9cad77a434855f34e50bd0e3f4e5d40320d062adfd7136" Jan 10 16:49:20 crc kubenswrapper[5036]: I0110 16:49:20.391753 5036 scope.go:117] "RemoveContainer" containerID="5c58205136eb086db1b39529e4118ad4635d97fc9b6476a5d671853e9e0aa9d9" Jan 10 16:50:20 crc kubenswrapper[5036]: I0110 16:50:20.482699 5036 scope.go:117] "RemoveContainer" containerID="68ae9389e51072775c29153c00dd3c84a5516d103a50ab9e818f1ccdae8235a2" Jan 10 16:50:20 crc kubenswrapper[5036]: I0110 16:50:20.516314 5036 scope.go:117] "RemoveContainer" containerID="9b50602cb81b4a10f23f70316e09fc6975bf9fbf04c93e6d7e6395516dea43f5" Jan 10 16:50:20 crc kubenswrapper[5036]: I0110 16:50:20.582584 5036 scope.go:117] "RemoveContainer" containerID="cdf2698d6d411afc14345f2ad6de4d7166f342a99c8f717405de6bc8e2679a44" Jan 10 16:51:25 crc kubenswrapper[5036]: I0110 16:51:25.904239 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:51:25 crc kubenswrapper[5036]: I0110 16:51:25.904831 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:51:55 crc kubenswrapper[5036]: I0110 16:51:55.904950 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:51:55 crc kubenswrapper[5036]: I0110 16:51:55.905604 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:51:59 crc kubenswrapper[5036]: I0110 16:51:59.549277 5036 generic.go:334] "Generic (PLEG): container finished" podID="1d19244c-8236-481b-9b50-b6a641c7b724" containerID="4c8534a5bf9b3e37f93bd5ed1c6a0f741332efeb5607e9f1b6c120b09a26db01" exitCode=0 Jan 10 16:51:59 crc kubenswrapper[5036]: I0110 16:51:59.549381 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" event={"ID":"1d19244c-8236-481b-9b50-b6a641c7b724","Type":"ContainerDied","Data":"4c8534a5bf9b3e37f93bd5ed1c6a0f741332efeb5607e9f1b6c120b09a26db01"} Jan 10 16:52:00 crc kubenswrapper[5036]: I0110 16:52:00.981537 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.161787 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory\") pod \"1d19244c-8236-481b-9b50-b6a641c7b724\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.161910 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle\") pod \"1d19244c-8236-481b-9b50-b6a641c7b724\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.162056 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam\") pod \"1d19244c-8236-481b-9b50-b6a641c7b724\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.162133 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqbxm\" (UniqueName: \"kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm\") pod \"1d19244c-8236-481b-9b50-b6a641c7b724\" (UID: \"1d19244c-8236-481b-9b50-b6a641c7b724\") " Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.171126 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm" (OuterVolumeSpecName: "kube-api-access-xqbxm") pod "1d19244c-8236-481b-9b50-b6a641c7b724" (UID: "1d19244c-8236-481b-9b50-b6a641c7b724"). InnerVolumeSpecName "kube-api-access-xqbxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.182297 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1d19244c-8236-481b-9b50-b6a641c7b724" (UID: "1d19244c-8236-481b-9b50-b6a641c7b724"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.198931 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory" (OuterVolumeSpecName: "inventory") pod "1d19244c-8236-481b-9b50-b6a641c7b724" (UID: "1d19244c-8236-481b-9b50-b6a641c7b724"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.198970 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d19244c-8236-481b-9b50-b6a641c7b724" (UID: "1d19244c-8236-481b-9b50-b6a641c7b724"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.264977 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.265030 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqbxm\" (UniqueName: \"kubernetes.io/projected/1d19244c-8236-481b-9b50-b6a641c7b724-kube-api-access-xqbxm\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.265044 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.265053 5036 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d19244c-8236-481b-9b50-b6a641c7b724-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.568588 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" event={"ID":"1d19244c-8236-481b-9b50-b6a641c7b724","Type":"ContainerDied","Data":"1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935"} Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.568949 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1beeefe38d2d1e8224617af27ebaf31e60f0590017f2a571c195abf332632935" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.568774 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.675828 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz"] Jan 10 16:52:01 crc kubenswrapper[5036]: E0110 16:52:01.676138 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d19244c-8236-481b-9b50-b6a641c7b724" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.676154 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d19244c-8236-481b-9b50-b6a641c7b724" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.676314 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d19244c-8236-481b-9b50-b6a641c7b724" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.676881 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.679315 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.679555 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.679933 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.680145 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.688841 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz"] Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.876099 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4725v\" (UniqueName: \"kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.876184 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.876217 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 
crc kubenswrapper[5036]: I0110 16:52:01.978251 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4725v\" (UniqueName: \"kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.978327 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.978349 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.982805 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.987238 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:01 crc kubenswrapper[5036]: I0110 16:52:01.997735 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4725v\" (UniqueName: \"kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:02 crc kubenswrapper[5036]: I0110 16:52:02.294889 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:52:02 crc kubenswrapper[5036]: I0110 16:52:02.889203 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz"] Jan 10 16:52:02 crc kubenswrapper[5036]: W0110 16:52:02.889368 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c5a7464_a4c9_4aed_a25f_1d19266239b4.slice/crio-8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164 WatchSource:0}: Error finding container 8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164: Status 404 returned error can't find the container with id 8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164 Jan 10 16:52:03 crc kubenswrapper[5036]: I0110 16:52:03.598020 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" event={"ID":"3c5a7464-a4c9-4aed-a25f-1d19266239b4","Type":"ContainerStarted","Data":"8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164"} Jan 10 16:52:04 crc kubenswrapper[5036]: I0110 16:52:04.607477 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" event={"ID":"3c5a7464-a4c9-4aed-a25f-1d19266239b4","Type":"ContainerStarted","Data":"154ea3fa105911d8e7aa267185cf51e57c2a00885b21fae621da532db74454ea"} Jan 10 16:52:04 crc kubenswrapper[5036]: I0110 16:52:04.689547 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" podStartSLOduration=2.914540212 podStartE2EDuration="3.689525332s" podCreationTimestamp="2026-01-10 16:52:01 +0000 UTC" firstStartedPulling="2026-01-10 16:52:02.892298574 +0000 UTC m=+1444.762534078" lastFinishedPulling="2026-01-10 16:52:03.667283704 +0000 UTC m=+1445.537519198" observedRunningTime="2026-01-10 16:52:04.642880601 +0000 UTC m=+1446.513116095" watchObservedRunningTime="2026-01-10 16:52:04.689525332 +0000 UTC m=+1446.559760826" Jan 10 16:52:20 crc kubenswrapper[5036]: I0110 16:52:20.685970 5036 scope.go:117] "RemoveContainer" containerID="062f4cc5c2f78ab1962c592ee9b9dd067adf2b9f1d9c840c347c47ba459c70a3" Jan 10 16:52:25 crc kubenswrapper[5036]: I0110 16:52:25.904083 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:52:25 crc kubenswrapper[5036]: I0110 16:52:25.904617 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:52:25 crc kubenswrapper[5036]: I0110 16:52:25.904660 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:52:25 crc kubenswrapper[5036]: I0110 16:52:25.905329 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:52:25 crc kubenswrapper[5036]: I0110 16:52:25.905401 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785" gracePeriod=600 Jan 10 16:52:26 crc kubenswrapper[5036]: I0110 16:52:26.794373 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785" exitCode=0 Jan 10 16:52:26 crc kubenswrapper[5036]: I0110 16:52:26.794471 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785"} Jan 10 16:52:26 crc kubenswrapper[5036]: I0110 16:52:26.794732 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219"} Jan 10 16:52:26 crc kubenswrapper[5036]: I0110 16:52:26.794761 5036 scope.go:117] "RemoveContainer" containerID="d28b27960f834840be7757d03723d2d7badcd48dee80eda66e746096741e71be" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.198548 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.200919 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.220134 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.302474 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njcxc\" (UniqueName: \"kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.302569 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.302745 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.403909 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.404289 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njcxc\" (UniqueName: \"kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.404324 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.404573 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.404644 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.422995 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-njcxc\" (UniqueName: \"kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc\") pod \"redhat-marketplace-g5dzj\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: I0110 16:52:40.544799 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:40 crc kubenswrapper[5036]: W0110 16:52:40.996181 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6c2b0b_67f6_49a0_b686_14486919f888.slice/crio-600d44a7576ca7de6cc41f1d596ba50e4c6c99e7216f5b17bf8358647598a04f WatchSource:0}: Error finding container 600d44a7576ca7de6cc41f1d596ba50e4c6c99e7216f5b17bf8358647598a04f: Status 404 returned error can't find the container with id 600d44a7576ca7de6cc41f1d596ba50e4c6c99e7216f5b17bf8358647598a04f Jan 10 16:52:41 crc kubenswrapper[5036]: I0110 16:52:41.000582 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:41 crc kubenswrapper[5036]: I0110 16:52:41.935007 5036 generic.go:334] "Generic (PLEG): container finished" podID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerID="f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78" exitCode=0 Jan 10 16:52:41 crc kubenswrapper[5036]: I0110 16:52:41.935065 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerDied","Data":"f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78"} Jan 10 16:52:41 crc kubenswrapper[5036]: I0110 16:52:41.935101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerStarted","Data":"600d44a7576ca7de6cc41f1d596ba50e4c6c99e7216f5b17bf8358647598a04f"} Jan 10 16:52:42 crc kubenswrapper[5036]: I0110 16:52:42.944339 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerStarted","Data":"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3"} Jan 10 16:52:43 crc kubenswrapper[5036]: I0110 16:52:43.956755 5036 generic.go:334] "Generic (PLEG): container finished" podID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerID="4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3" exitCode=0 Jan 10 16:52:43 crc kubenswrapper[5036]: I0110 16:52:43.956824 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerDied","Data":"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3"} Jan 10 16:52:44 crc kubenswrapper[5036]: I0110 16:52:44.966156 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerStarted","Data":"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e"} Jan 10 16:52:44 crc kubenswrapper[5036]: I0110 16:52:44.990875 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g5dzj" podStartSLOduration=2.5712131769999997 
podStartE2EDuration="4.990851031s" podCreationTimestamp="2026-01-10 16:52:40 +0000 UTC" firstStartedPulling="2026-01-10 16:52:41.937378676 +0000 UTC m=+1483.807614180" lastFinishedPulling="2026-01-10 16:52:44.35701649 +0000 UTC m=+1486.227252034" observedRunningTime="2026-01-10 16:52:44.982494972 +0000 UTC m=+1486.852730476" watchObservedRunningTime="2026-01-10 16:52:44.990851031 +0000 UTC m=+1486.861086525" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.411532 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.413606 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.426322 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.545948 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phr5v\" (UniqueName: \"kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.546054 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.546094 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.649949 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.650045 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.650186 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phr5v\" (UniqueName: \"kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.650692 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.651416 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.676230 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phr5v\" (UniqueName: \"kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v\") pod \"redhat-operators-vx7bc\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:48 crc kubenswrapper[5036]: I0110 16:52:48.737140 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:49 crc kubenswrapper[5036]: I0110 16:52:49.187732 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.019038 5036 generic.go:334] "Generic (PLEG): container finished" podID="a78242df-46af-42ee-9cfb-602687075af6" containerID="00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337" exitCode=0 Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.019415 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerDied","Data":"00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337"} Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.019457 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerStarted","Data":"141b5af4248562eb5450142b9444d8c3a5ee168706a60553c89d684fca4d9eb5"} Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.544968 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.545349 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:50 crc kubenswrapper[5036]: I0110 16:52:50.589607 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:51 crc kubenswrapper[5036]: I0110 16:52:51.031499 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerStarted","Data":"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163"} Jan 10 16:52:51 crc kubenswrapper[5036]: I0110 16:52:51.084156 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:52 crc kubenswrapper[5036]: I0110 16:52:52.992380 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.053993 5036 
generic.go:334] "Generic (PLEG): container finished" podID="a78242df-46af-42ee-9cfb-602687075af6" containerID="d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163" exitCode=0 Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.054042 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerDied","Data":"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163"} Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.054234 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g5dzj" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="registry-server" containerID="cri-o://a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e" gracePeriod=2 Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.499324 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.544326 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njcxc\" (UniqueName: \"kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc\") pod \"ca6c2b0b-67f6-49a0-b686-14486919f888\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.544712 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content\") pod \"ca6c2b0b-67f6-49a0-b686-14486919f888\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.544922 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities\") pod \"ca6c2b0b-67f6-49a0-b686-14486919f888\" (UID: \"ca6c2b0b-67f6-49a0-b686-14486919f888\") " Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.546047 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities" (OuterVolumeSpecName: "utilities") pod "ca6c2b0b-67f6-49a0-b686-14486919f888" (UID: "ca6c2b0b-67f6-49a0-b686-14486919f888"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.562583 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc" (OuterVolumeSpecName: "kube-api-access-njcxc") pod "ca6c2b0b-67f6-49a0-b686-14486919f888" (UID: "ca6c2b0b-67f6-49a0-b686-14486919f888"). InnerVolumeSpecName "kube-api-access-njcxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.566930 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca6c2b0b-67f6-49a0-b686-14486919f888" (UID: "ca6c2b0b-67f6-49a0-b686-14486919f888"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.647222 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.647456 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njcxc\" (UniqueName: \"kubernetes.io/projected/ca6c2b0b-67f6-49a0-b686-14486919f888-kube-api-access-njcxc\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:53 crc kubenswrapper[5036]: I0110 16:52:53.647468 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca6c2b0b-67f6-49a0-b686-14486919f888-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.065785 5036 generic.go:334] "Generic (PLEG): container finished" podID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerID="a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e" exitCode=0 Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.065915 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5dzj" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.066797 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerDied","Data":"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e"} Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.066854 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5dzj" event={"ID":"ca6c2b0b-67f6-49a0-b686-14486919f888","Type":"ContainerDied","Data":"600d44a7576ca7de6cc41f1d596ba50e4c6c99e7216f5b17bf8358647598a04f"} Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.066879 5036 scope.go:117] "RemoveContainer" containerID="a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.069267 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerStarted","Data":"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0"} Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.093630 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vx7bc" podStartSLOduration=2.483379692 podStartE2EDuration="6.093610457s" podCreationTimestamp="2026-01-10 16:52:48 +0000 UTC" firstStartedPulling="2026-01-10 16:52:50.022196986 +0000 UTC m=+1491.892432490" lastFinishedPulling="2026-01-10 16:52:53.632427761 +0000 UTC m=+1495.502663255" observedRunningTime="2026-01-10 16:52:54.093150024 +0000 UTC m=+1495.963385528" watchObservedRunningTime="2026-01-10 16:52:54.093610457 +0000 UTC m=+1495.963845951" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.095744 5036 scope.go:117] "RemoveContainer" containerID="4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.112264 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.119138 5036 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-g5dzj"] Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.137012 5036 scope.go:117] "RemoveContainer" containerID="f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.155592 5036 scope.go:117] "RemoveContainer" containerID="a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e" Jan 10 16:52:54 crc kubenswrapper[5036]: E0110 16:52:54.156146 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e\": container with ID starting with a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e not found: ID does not exist" containerID="a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.156195 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e"} err="failed to get container status \"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e\": rpc error: code = NotFound desc = could not find container \"a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e\": container with ID starting with a36057db9a21e2905de60ac1e7e767293ca9d7ab0d678f6e8296386b45c64d5e not found: ID does not exist" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.156222 5036 scope.go:117] "RemoveContainer" containerID="4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3" Jan 10 16:52:54 crc kubenswrapper[5036]: E0110 16:52:54.156532 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3\": container with ID starting with 4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3 not found: ID does not exist" containerID="4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.156557 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3"} err="failed to get container status \"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3\": rpc error: code = NotFound desc = could not find container \"4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3\": container with ID starting with 4a1678ee7d4881b96c3155cfa5d64a93424494c9e44ab9870a13355c6bc891c3 not found: ID does not exist" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.156573 5036 scope.go:117] "RemoveContainer" containerID="f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78" Jan 10 16:52:54 crc kubenswrapper[5036]: E0110 16:52:54.157005 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78\": container with ID starting with f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78 not found: ID does not exist" containerID="f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.157042 5036 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78"} err="failed to get container status \"f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78\": rpc error: code = NotFound desc = could not find container \"f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78\": container with ID starting with f9acb07f63c7f1b062950431ade00628dd70db87f24cd43d00295f50aa56ca78 not found: ID does not exist" Jan 10 16:52:54 crc kubenswrapper[5036]: I0110 16:52:54.524130 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" path="/var/lib/kubelet/pods/ca6c2b0b-67f6-49a0-b686-14486919f888/volumes" Jan 10 16:52:58 crc kubenswrapper[5036]: I0110 16:52:58.738172 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:58 crc kubenswrapper[5036]: I0110 16:52:58.738955 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:52:59 crc kubenswrapper[5036]: I0110 16:52:59.789233 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vx7bc" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="registry-server" probeResult="failure" output=< Jan 10 16:52:59 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 16:52:59 crc kubenswrapper[5036]: > Jan 10 16:53:08 crc kubenswrapper[5036]: I0110 16:53:08.793632 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:53:08 crc kubenswrapper[5036]: I0110 16:53:08.849999 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:53:09 crc kubenswrapper[5036]: I0110 16:53:09.036535 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.222046 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vx7bc" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="registry-server" containerID="cri-o://664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0" gracePeriod=2 Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.656632 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.660950 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities\") pod \"a78242df-46af-42ee-9cfb-602687075af6\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.661030 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phr5v\" (UniqueName: \"kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v\") pod \"a78242df-46af-42ee-9cfb-602687075af6\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.661107 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content\") pod \"a78242df-46af-42ee-9cfb-602687075af6\" (UID: \"a78242df-46af-42ee-9cfb-602687075af6\") " Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.661971 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities" (OuterVolumeSpecName: "utilities") pod "a78242df-46af-42ee-9cfb-602687075af6" (UID: "a78242df-46af-42ee-9cfb-602687075af6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.668953 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v" (OuterVolumeSpecName: "kube-api-access-phr5v") pod "a78242df-46af-42ee-9cfb-602687075af6" (UID: "a78242df-46af-42ee-9cfb-602687075af6"). InnerVolumeSpecName "kube-api-access-phr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.763131 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.763159 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phr5v\" (UniqueName: \"kubernetes.io/projected/a78242df-46af-42ee-9cfb-602687075af6-kube-api-access-phr5v\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.845072 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a78242df-46af-42ee-9cfb-602687075af6" (UID: "a78242df-46af-42ee-9cfb-602687075af6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:53:10 crc kubenswrapper[5036]: I0110 16:53:10.864660 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a78242df-46af-42ee-9cfb-602687075af6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.125466 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kmjz4"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.132719 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0f05-account-create-update-vfvzx"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.139320 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2hht5"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.147247 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kmjz4"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.156234 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0f05-account-create-update-vfvzx"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.166188 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2hht5"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.232219 5036 generic.go:334] "Generic (PLEG): container finished" podID="a78242df-46af-42ee-9cfb-602687075af6" containerID="664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0" exitCode=0 Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.232265 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerDied","Data":"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0"} Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.232291 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vx7bc" event={"ID":"a78242df-46af-42ee-9cfb-602687075af6","Type":"ContainerDied","Data":"141b5af4248562eb5450142b9444d8c3a5ee168706a60553c89d684fca4d9eb5"} Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.232311 5036 scope.go:117] "RemoveContainer" containerID="664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.232311 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vx7bc" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.251245 5036 scope.go:117] "RemoveContainer" containerID="d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.273808 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.279750 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vx7bc"] Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.293838 5036 scope.go:117] "RemoveContainer" containerID="00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.330569 5036 scope.go:117] "RemoveContainer" containerID="664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0" Jan 10 16:53:11 crc kubenswrapper[5036]: E0110 16:53:11.330973 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0\": container with ID starting with 664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0 not found: ID does not exist" containerID="664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.331033 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0"} err="failed to get container status \"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0\": rpc error: code = NotFound desc = could not find container \"664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0\": container with ID starting with 664d9ca0a3e4566006c64068b56a920eb60762d7cc0b49d6616d0fe9a57801b0 not found: ID does not exist" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.331066 5036 scope.go:117] "RemoveContainer" containerID="d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163" Jan 10 16:53:11 crc kubenswrapper[5036]: E0110 16:53:11.331347 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163\": container with ID starting with d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163 not found: ID does not exist" containerID="d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.331370 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163"} err="failed to get container status \"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163\": rpc error: code = NotFound desc = could not find container \"d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163\": container with ID starting with d5499ce2772a27ca6a53b524fcb3c718beb548010f0503af8a87667abfa5a163 not found: ID does not exist" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.331384 5036 scope.go:117] "RemoveContainer" containerID="00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337" Jan 10 16:53:11 crc kubenswrapper[5036]: E0110 16:53:11.331644 5036 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337\": container with ID starting with 00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337 not found: ID does not exist" containerID="00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337" Jan 10 16:53:11 crc kubenswrapper[5036]: I0110 16:53:11.331736 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337"} err="failed to get container status \"00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337\": rpc error: code = NotFound desc = could not find container \"00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337\": container with ID starting with 00de5fb1df046958d03f20739d2e57334157f85078dace0464f275fd9db30337 not found: ID does not exist" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.031385 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-67ce-account-create-update-t7sxt"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.043492 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-67ce-account-create-update-t7sxt"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.052590 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bzzkq"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.060147 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-813f-account-create-update-m5fwq"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.066793 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bzzkq"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.073536 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-813f-account-create-update-m5fwq"] Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.519265 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0cdec2-be1f-4169-ade6-cf65905c7003" path="/var/lib/kubelet/pods/4c0cdec2-be1f-4169-ade6-cf65905c7003/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.519921 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b0a63f9-1828-482a-a1f1-d99bf4b8932e" path="/var/lib/kubelet/pods/8b0a63f9-1828-482a-a1f1-d99bf4b8932e/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.520561 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b5d891-59ef-43ca-9689-ec2f4bfa590c" path="/var/lib/kubelet/pods/90b5d891-59ef-43ca-9689-ec2f4bfa590c/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.521144 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78242df-46af-42ee-9cfb-602687075af6" path="/var/lib/kubelet/pods/a78242df-46af-42ee-9cfb-602687075af6/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.522437 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62" path="/var/lib/kubelet/pods/aeb3f8c5-2ecf-47d7-9ef3-4550ed574b62/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.523121 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d206d6b1-be89-44b7-a4db-749bd0113be2" path="/var/lib/kubelet/pods/d206d6b1-be89-44b7-a4db-749bd0113be2/volumes" Jan 10 16:53:12 crc kubenswrapper[5036]: I0110 16:53:12.523669 5036 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="edca4de3-92c0-449a-a081-8868ada61ff8" path="/var/lib/kubelet/pods/edca4de3-92c0-449a-a081-8868ada61ff8/volumes" Jan 10 16:53:16 crc kubenswrapper[5036]: I0110 16:53:16.287116 5036 generic.go:334] "Generic (PLEG): container finished" podID="3c5a7464-a4c9-4aed-a25f-1d19266239b4" containerID="154ea3fa105911d8e7aa267185cf51e57c2a00885b21fae621da532db74454ea" exitCode=0 Jan 10 16:53:16 crc kubenswrapper[5036]: I0110 16:53:16.287360 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" event={"ID":"3c5a7464-a4c9-4aed-a25f-1d19266239b4","Type":"ContainerDied","Data":"154ea3fa105911d8e7aa267185cf51e57c2a00885b21fae621da532db74454ea"} Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.803130 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.854220 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory\") pod \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.855035 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4725v\" (UniqueName: \"kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v\") pod \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.855161 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam\") pod \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\" (UID: \"3c5a7464-a4c9-4aed-a25f-1d19266239b4\") " Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.863487 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v" (OuterVolumeSpecName: "kube-api-access-4725v") pod "3c5a7464-a4c9-4aed-a25f-1d19266239b4" (UID: "3c5a7464-a4c9-4aed-a25f-1d19266239b4"). InnerVolumeSpecName "kube-api-access-4725v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.906885 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3c5a7464-a4c9-4aed-a25f-1d19266239b4" (UID: "3c5a7464-a4c9-4aed-a25f-1d19266239b4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.910769 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory" (OuterVolumeSpecName: "inventory") pod "3c5a7464-a4c9-4aed-a25f-1d19266239b4" (UID: "3c5a7464-a4c9-4aed-a25f-1d19266239b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.957966 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4725v\" (UniqueName: \"kubernetes.io/projected/3c5a7464-a4c9-4aed-a25f-1d19266239b4-kube-api-access-4725v\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.958404 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:17 crc kubenswrapper[5036]: I0110 16:53:17.958530 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3c5a7464-a4c9-4aed-a25f-1d19266239b4-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.328971 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" event={"ID":"3c5a7464-a4c9-4aed-a25f-1d19266239b4","Type":"ContainerDied","Data":"8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164"} Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.329026 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8927c064114a6a1e76419dd25fde3834f922560c14358899b6dfb958649fb164" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.329086 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432471 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2"] Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432823 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432834 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432845 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="extract-content" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432851 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="extract-content" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432869 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="extract-utilities" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432875 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="extract-utilities" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432885 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="extract-utilities" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432891 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="extract-utilities" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432900 5036 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432905 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432915 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5a7464-a4c9-4aed-a25f-1d19266239b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432922 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5a7464-a4c9-4aed-a25f-1d19266239b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:18 crc kubenswrapper[5036]: E0110 16:53:18.432935 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="extract-content" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.432940 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="extract-content" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.433106 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78242df-46af-42ee-9cfb-602687075af6" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.433153 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5a7464-a4c9-4aed-a25f-1d19266239b4" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.433172 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca6c2b0b-67f6-49a0-b686-14486919f888" containerName="registry-server" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.433741 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.439913 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2"] Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.466941 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.466941 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.467358 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.467502 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.576786 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.577306 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.577352 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftfbv\" (UniqueName: \"kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.679127 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.679191 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftfbv\" (UniqueName: \"kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.679241 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.684469 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.684579 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.696225 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftfbv\" (UniqueName: \"kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:18 crc kubenswrapper[5036]: I0110 16:53:18.785352 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:19 crc kubenswrapper[5036]: I0110 16:53:19.299028 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2"] Jan 10 16:53:19 crc kubenswrapper[5036]: I0110 16:53:19.307601 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:53:19 crc kubenswrapper[5036]: I0110 16:53:19.342357 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" event={"ID":"51f2b92e-a982-4894-8f90-b500b35e1016","Type":"ContainerStarted","Data":"fd1a1215addf39cece372e27733e64c98bfa49507d56ac54a58b5550681a270c"} Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.352972 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" event={"ID":"51f2b92e-a982-4894-8f90-b500b35e1016","Type":"ContainerStarted","Data":"cdb712ec82a0d71a991537a44993a692f3d0fe5eb126b85791d1ad8bdcbd22da"} Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.735671 5036 scope.go:117] "RemoveContainer" containerID="373c35ab5e490fef737c84e596ab3610712a07ef15c8f5f8d1cf213c02dbb2d2" Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.761772 5036 scope.go:117] "RemoveContainer" containerID="ee74419f111dba9520df1b18aafc859438c57e7070c97f35ec1de30560dddddd" Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.831322 5036 scope.go:117] "RemoveContainer" containerID="50a713c8fd8d05f221fc8976bf3b1bb4574a5a58edb623c6b0ac8033165cbd5e" Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.864347 5036 scope.go:117] "RemoveContainer" 
containerID="551ef618555643988bdae04127251fe941ba1ffb152c11e816b8bcd85bdf42f4" Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.882356 5036 scope.go:117] "RemoveContainer" containerID="e8a700359f53490edb5a795d09be5fdbb709b849b02a9de930eb78d0f9537c7e" Jan 10 16:53:20 crc kubenswrapper[5036]: I0110 16:53:20.900606 5036 scope.go:117] "RemoveContainer" containerID="461ecf8322edf244c214d8b7efa3681c532d9083bb446e80b059a7e71741f1f8" Jan 10 16:53:23 crc kubenswrapper[5036]: I0110 16:53:23.049313 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" podStartSLOduration=4.398095953 podStartE2EDuration="5.049284994s" podCreationTimestamp="2026-01-10 16:53:18 +0000 UTC" firstStartedPulling="2026-01-10 16:53:19.307340889 +0000 UTC m=+1521.177576383" lastFinishedPulling="2026-01-10 16:53:19.95852989 +0000 UTC m=+1521.828765424" observedRunningTime="2026-01-10 16:53:20.38307675 +0000 UTC m=+1522.253312244" watchObservedRunningTime="2026-01-10 16:53:23.049284994 +0000 UTC m=+1524.919520478" Jan 10 16:53:23 crc kubenswrapper[5036]: I0110 16:53:23.052635 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5msnc"] Jan 10 16:53:23 crc kubenswrapper[5036]: I0110 16:53:23.062078 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5msnc"] Jan 10 16:53:24 crc kubenswrapper[5036]: I0110 16:53:24.521979 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c7eb4b-3f80-4f63-812b-7001e40c872f" path="/var/lib/kubelet/pods/d7c7eb4b-3f80-4f63-812b-7001e40c872f/volumes" Jan 10 16:53:25 crc kubenswrapper[5036]: I0110 16:53:25.448158 5036 generic.go:334] "Generic (PLEG): container finished" podID="51f2b92e-a982-4894-8f90-b500b35e1016" containerID="cdb712ec82a0d71a991537a44993a692f3d0fe5eb126b85791d1ad8bdcbd22da" exitCode=0 Jan 10 16:53:25 crc kubenswrapper[5036]: I0110 16:53:25.448200 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" event={"ID":"51f2b92e-a982-4894-8f90-b500b35e1016","Type":"ContainerDied","Data":"cdb712ec82a0d71a991537a44993a692f3d0fe5eb126b85791d1ad8bdcbd22da"} Jan 10 16:53:26 crc kubenswrapper[5036]: I0110 16:53:26.843474 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.030526 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftfbv\" (UniqueName: \"kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv\") pod \"51f2b92e-a982-4894-8f90-b500b35e1016\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.030718 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam\") pod \"51f2b92e-a982-4894-8f90-b500b35e1016\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.030824 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory\") pod \"51f2b92e-a982-4894-8f90-b500b35e1016\" (UID: \"51f2b92e-a982-4894-8f90-b500b35e1016\") " Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.043037 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv" (OuterVolumeSpecName: "kube-api-access-ftfbv") pod "51f2b92e-a982-4894-8f90-b500b35e1016" (UID: "51f2b92e-a982-4894-8f90-b500b35e1016"). InnerVolumeSpecName "kube-api-access-ftfbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.065473 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "51f2b92e-a982-4894-8f90-b500b35e1016" (UID: "51f2b92e-a982-4894-8f90-b500b35e1016"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.074517 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory" (OuterVolumeSpecName: "inventory") pod "51f2b92e-a982-4894-8f90-b500b35e1016" (UID: "51f2b92e-a982-4894-8f90-b500b35e1016"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.133361 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftfbv\" (UniqueName: \"kubernetes.io/projected/51f2b92e-a982-4894-8f90-b500b35e1016-kube-api-access-ftfbv\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.133396 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.133441 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51f2b92e-a982-4894-8f90-b500b35e1016-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.466822 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" event={"ID":"51f2b92e-a982-4894-8f90-b500b35e1016","Type":"ContainerDied","Data":"fd1a1215addf39cece372e27733e64c98bfa49507d56ac54a58b5550681a270c"} Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.466866 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd1a1215addf39cece372e27733e64c98bfa49507d56ac54a58b5550681a270c" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.466930 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.551265 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724"] Jan 10 16:53:27 crc kubenswrapper[5036]: E0110 16:53:27.551696 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f2b92e-a982-4894-8f90-b500b35e1016" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.551717 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f2b92e-a982-4894-8f90-b500b35e1016" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.551951 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f2b92e-a982-4894-8f90-b500b35e1016" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.552637 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.557194 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.557381 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.557652 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.558243 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.565829 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724"] Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.745539 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqpg\" (UniqueName: \"kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.745933 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.746103 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.847243 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqpg\" (UniqueName: \"kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.848357 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.848480 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.853480 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.853507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.875313 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgqpg\" (UniqueName: \"kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hf724\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:27 crc kubenswrapper[5036]: I0110 16:53:27.888606 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:53:28 crc kubenswrapper[5036]: I0110 16:53:28.543675 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724"] Jan 10 16:53:29 crc kubenswrapper[5036]: I0110 16:53:29.484812 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" event={"ID":"3978931b-044e-4415-9e0b-5f16130dccc6","Type":"ContainerStarted","Data":"f24e224e180ff8eb956a58588ad6c4458e422e67846dbcee935971b1749e185f"} Jan 10 16:53:29 crc kubenswrapper[5036]: I0110 16:53:29.485157 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" event={"ID":"3978931b-044e-4415-9e0b-5f16130dccc6","Type":"ContainerStarted","Data":"ebafc1c870719acceb3084ad1bd44dbe7cd282a828307962294606d0bd5df519"} Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.879471 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" podStartSLOduration=6.396023902 podStartE2EDuration="6.879435813s" podCreationTimestamp="2026-01-10 16:53:27 +0000 UTC" firstStartedPulling="2026-01-10 16:53:28.542541675 +0000 UTC m=+1530.412777169" lastFinishedPulling="2026-01-10 16:53:29.025953566 +0000 UTC m=+1530.896189080" observedRunningTime="2026-01-10 16:53:29.509299845 +0000 UTC m=+1531.379535379" watchObservedRunningTime="2026-01-10 16:53:33.879435813 +0000 UTC m=+1535.749671337" Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.890819 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.893760 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.904862 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.980040 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pd9v\" (UniqueName: \"kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.980154 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:33 crc kubenswrapper[5036]: I0110 16:53:33.980233 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.081863 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pd9v\" (UniqueName: \"kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.081949 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.082008 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.082382 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.082636 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.101358 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6pd9v\" (UniqueName: \"kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v\") pod \"community-operators-wmv5n\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.233918 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:34 crc kubenswrapper[5036]: I0110 16:53:34.817640 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:35 crc kubenswrapper[5036]: I0110 16:53:35.565972 5036 generic.go:334] "Generic (PLEG): container finished" podID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerID="2f7034153490443e8c5c89aad5807a4745aecde5dd7d98519ae78de7cdc4c5b4" exitCode=0 Jan 10 16:53:35 crc kubenswrapper[5036]: I0110 16:53:35.566027 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerDied","Data":"2f7034153490443e8c5c89aad5807a4745aecde5dd7d98519ae78de7cdc4c5b4"} Jan 10 16:53:35 crc kubenswrapper[5036]: I0110 16:53:35.566064 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerStarted","Data":"68906eb00bfbbeeeb944f5e89db441be7cfcc054cca0bae54e681272bcb8ec2c"} Jan 10 16:53:36 crc kubenswrapper[5036]: I0110 16:53:36.577929 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerStarted","Data":"8a9ffaa16b7c7a467f0ec1f47e215b5eda413f2801bbcc00c166760c00baa731"} Jan 10 16:53:37 crc kubenswrapper[5036]: I0110 16:53:37.597435 5036 generic.go:334] "Generic (PLEG): container finished" podID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerID="8a9ffaa16b7c7a467f0ec1f47e215b5eda413f2801bbcc00c166760c00baa731" exitCode=0 Jan 10 16:53:37 crc kubenswrapper[5036]: I0110 16:53:37.597576 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerDied","Data":"8a9ffaa16b7c7a467f0ec1f47e215b5eda413f2801bbcc00c166760c00baa731"} Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.037740 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-lk484"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.078408 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-21a4-account-create-update-qpvgs"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.090851 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-lk484"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.098237 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-468c-account-create-update-2r8fm"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.104841 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-21a4-account-create-update-qpvgs"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.111295 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kmlkm"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.128827 5036 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c5a3-account-create-update-htbm6"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.136501 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kmlkm"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.151181 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c5a3-account-create-update-htbm6"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.159448 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-468c-account-create-update-2r8fm"] Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.518085 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199f55a8-575e-4b45-add1-ed5d4da32d21" path="/var/lib/kubelet/pods/199f55a8-575e-4b45-add1-ed5d4da32d21/volumes" Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.519108 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436d1751-fb2f-45ab-a1c8-a64e3f8b628f" path="/var/lib/kubelet/pods/436d1751-fb2f-45ab-a1c8-a64e3f8b628f/volumes" Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.519719 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9374853e-04a4-4903-877b-f725f5066bfc" path="/var/lib/kubelet/pods/9374853e-04a4-4903-877b-f725f5066bfc/volumes" Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.520473 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97027cc1-5cae-4bbf-8b11-5f7103ba4f09" path="/var/lib/kubelet/pods/97027cc1-5cae-4bbf-8b11-5f7103ba4f09/volumes" Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.521567 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7d3ebc-490f-4fbd-a86d-469b3c7f281c" path="/var/lib/kubelet/pods/cd7d3ebc-490f-4fbd-a86d-469b3c7f281c/volumes" Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.608714 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerStarted","Data":"0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804"} Jan 10 16:53:38 crc kubenswrapper[5036]: I0110 16:53:38.630381 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmv5n" podStartSLOduration=3.192805974 podStartE2EDuration="5.630364293s" podCreationTimestamp="2026-01-10 16:53:33 +0000 UTC" firstStartedPulling="2026-01-10 16:53:35.568877743 +0000 UTC m=+1537.439113247" lastFinishedPulling="2026-01-10 16:53:38.006436072 +0000 UTC m=+1539.876671566" observedRunningTime="2026-01-10 16:53:38.62643368 +0000 UTC m=+1540.496669184" watchObservedRunningTime="2026-01-10 16:53:38.630364293 +0000 UTC m=+1540.500599787" Jan 10 16:53:39 crc kubenswrapper[5036]: I0110 16:53:39.025257 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ls5rk"] Jan 10 16:53:39 crc kubenswrapper[5036]: I0110 16:53:39.032131 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ls5rk"] Jan 10 16:53:40 crc kubenswrapper[5036]: I0110 16:53:40.518796 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d244a3f-4202-469c-a576-b14fb7323180" path="/var/lib/kubelet/pods/9d244a3f-4202-469c-a576-b14fb7323180/volumes" Jan 10 16:53:44 crc kubenswrapper[5036]: I0110 16:53:44.234971 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:44 crc kubenswrapper[5036]: I0110 16:53:44.236246 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:44 crc kubenswrapper[5036]: I0110 16:53:44.294776 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:44 crc kubenswrapper[5036]: I0110 16:53:44.714402 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:44 crc kubenswrapper[5036]: I0110 16:53:44.763652 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:46 crc kubenswrapper[5036]: I0110 16:53:46.679162 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmv5n" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="registry-server" containerID="cri-o://0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804" gracePeriod=2 Jan 10 16:53:47 crc kubenswrapper[5036]: E0110 16:53:47.067659 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a5215f_b9ce_414c_8b8e_80ec47cacc8c.slice/crio-conmon-0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804.scope\": RecentStats: unable to find data in memory cache]" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.688928 5036 generic.go:334] "Generic (PLEG): container finished" podID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerID="0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804" exitCode=0 Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.688972 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerDied","Data":"0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804"} Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.689206 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmv5n" event={"ID":"62a5215f-b9ce-414c-8b8e-80ec47cacc8c","Type":"ContainerDied","Data":"68906eb00bfbbeeeb944f5e89db441be7cfcc054cca0bae54e681272bcb8ec2c"} Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.689221 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68906eb00bfbbeeeb944f5e89db441be7cfcc054cca0bae54e681272bcb8ec2c" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.689370 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.854523 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pd9v\" (UniqueName: \"kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v\") pod \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.855135 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities\") pod \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.855204 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content\") pod \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\" (UID: \"62a5215f-b9ce-414c-8b8e-80ec47cacc8c\") " Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.856399 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities" (OuterVolumeSpecName: "utilities") pod "62a5215f-b9ce-414c-8b8e-80ec47cacc8c" (UID: "62a5215f-b9ce-414c-8b8e-80ec47cacc8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.862948 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v" (OuterVolumeSpecName: "kube-api-access-6pd9v") pod "62a5215f-b9ce-414c-8b8e-80ec47cacc8c" (UID: "62a5215f-b9ce-414c-8b8e-80ec47cacc8c"). InnerVolumeSpecName "kube-api-access-6pd9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.916369 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62a5215f-b9ce-414c-8b8e-80ec47cacc8c" (UID: "62a5215f-b9ce-414c-8b8e-80ec47cacc8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.957528 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.957570 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pd9v\" (UniqueName: \"kubernetes.io/projected/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-kube-api-access-6pd9v\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:47 crc kubenswrapper[5036]: I0110 16:53:47.957586 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62a5215f-b9ce-414c-8b8e-80ec47cacc8c-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.043156 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qpfq8"] Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.055515 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qpfq8"] Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.530163 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d623293c-52c5-4236-9a80-1ac9af4517d4" path="/var/lib/kubelet/pods/d623293c-52c5-4236-9a80-1ac9af4517d4/volumes" Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.699582 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmv5n" Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.731651 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:48 crc kubenswrapper[5036]: I0110 16:53:48.738540 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmv5n"] Jan 10 16:53:50 crc kubenswrapper[5036]: I0110 16:53:50.545953 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" path="/var/lib/kubelet/pods/62a5215f-b9ce-414c-8b8e-80ec47cacc8c/volumes" Jan 10 16:54:03 crc kubenswrapper[5036]: I0110 16:54:03.027445 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lj72s"] Jan 10 16:54:03 crc kubenswrapper[5036]: I0110 16:54:03.035277 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lj72s"] Jan 10 16:54:04 crc kubenswrapper[5036]: I0110 16:54:04.517033 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a8e315-dd60-47a9-b03c-0897b6f21b3d" path="/var/lib/kubelet/pods/09a8e315-dd60-47a9-b03c-0897b6f21b3d/volumes" Jan 10 16:54:10 crc kubenswrapper[5036]: I0110 16:54:10.896498 5036 generic.go:334] "Generic (PLEG): container finished" podID="3978931b-044e-4415-9e0b-5f16130dccc6" containerID="f24e224e180ff8eb956a58588ad6c4458e422e67846dbcee935971b1749e185f" exitCode=0 Jan 10 16:54:10 crc kubenswrapper[5036]: I0110 16:54:10.896590 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" event={"ID":"3978931b-044e-4415-9e0b-5f16130dccc6","Type":"ContainerDied","Data":"f24e224e180ff8eb956a58588ad6c4458e422e67846dbcee935971b1749e185f"} Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.349010 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.425276 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgqpg\" (UniqueName: \"kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg\") pod \"3978931b-044e-4415-9e0b-5f16130dccc6\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.425382 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam\") pod \"3978931b-044e-4415-9e0b-5f16130dccc6\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.425514 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory\") pod \"3978931b-044e-4415-9e0b-5f16130dccc6\" (UID: \"3978931b-044e-4415-9e0b-5f16130dccc6\") " Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.430866 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg" (OuterVolumeSpecName: "kube-api-access-bgqpg") pod "3978931b-044e-4415-9e0b-5f16130dccc6" (UID: "3978931b-044e-4415-9e0b-5f16130dccc6"). InnerVolumeSpecName "kube-api-access-bgqpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.449883 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory" (OuterVolumeSpecName: "inventory") pod "3978931b-044e-4415-9e0b-5f16130dccc6" (UID: "3978931b-044e-4415-9e0b-5f16130dccc6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.452373 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3978931b-044e-4415-9e0b-5f16130dccc6" (UID: "3978931b-044e-4415-9e0b-5f16130dccc6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.528203 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgqpg\" (UniqueName: \"kubernetes.io/projected/3978931b-044e-4415-9e0b-5f16130dccc6-kube-api-access-bgqpg\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.528236 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.528247 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3978931b-044e-4415-9e0b-5f16130dccc6-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.920832 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" event={"ID":"3978931b-044e-4415-9e0b-5f16130dccc6","Type":"ContainerDied","Data":"ebafc1c870719acceb3084ad1bd44dbe7cd282a828307962294606d0bd5df519"} Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.920913 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724" Jan 10 16:54:12 crc kubenswrapper[5036]: I0110 16:54:12.920913 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebafc1c870719acceb3084ad1bd44dbe7cd282a828307962294606d0bd5df519" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.005851 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l"] Jan 10 16:54:13 crc kubenswrapper[5036]: E0110 16:54:13.006339 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3978931b-044e-4415-9e0b-5f16130dccc6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.006370 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3978931b-044e-4415-9e0b-5f16130dccc6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:13 crc kubenswrapper[5036]: E0110 16:54:13.006388 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="registry-server" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.006421 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="registry-server" Jan 10 16:54:13 crc kubenswrapper[5036]: E0110 16:54:13.006436 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="extract-utilities" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.006452 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="extract-utilities" Jan 10 16:54:13 crc kubenswrapper[5036]: E0110 16:54:13.006489 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="extract-content" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.006500 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="extract-content" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.008153 
5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a5215f-b9ce-414c-8b8e-80ec47cacc8c" containerName="registry-server" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.008196 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3978931b-044e-4415-9e0b-5f16130dccc6" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.009213 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.012492 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.019261 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l"] Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.019641 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.019907 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.019935 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.040975 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjrvn\" (UniqueName: \"kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.041201 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.041265 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.143397 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.143862 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.144840 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjrvn\" (UniqueName: \"kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.148153 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.155241 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.161205 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjrvn\" (UniqueName: \"kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.346649 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.900092 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l"] Jan 10 16:54:13 crc kubenswrapper[5036]: W0110 16:54:13.912199 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45942a81_832d_42f1_b68d_57e92315759c.slice/crio-fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd WatchSource:0}: Error finding container fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd: Status 404 returned error can't find the container with id fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd Jan 10 16:54:13 crc kubenswrapper[5036]: I0110 16:54:13.934137 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" event={"ID":"45942a81-832d-42f1-b68d-57e92315759c","Type":"ContainerStarted","Data":"fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd"} Jan 10 16:54:14 crc kubenswrapper[5036]: I0110 16:54:14.956182 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" event={"ID":"45942a81-832d-42f1-b68d-57e92315759c","Type":"ContainerStarted","Data":"eac7769a5f0268847dd1682479c36236728e90ac1aff3a319efee4508b56dbaa"} Jan 10 16:54:20 crc kubenswrapper[5036]: I0110 16:54:20.016458 5036 generic.go:334] "Generic (PLEG): container finished" podID="45942a81-832d-42f1-b68d-57e92315759c" containerID="eac7769a5f0268847dd1682479c36236728e90ac1aff3a319efee4508b56dbaa" exitCode=0 Jan 10 16:54:20 crc kubenswrapper[5036]: I0110 16:54:20.016577 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" event={"ID":"45942a81-832d-42f1-b68d-57e92315759c","Type":"ContainerDied","Data":"eac7769a5f0268847dd1682479c36236728e90ac1aff3a319efee4508b56dbaa"} Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.023056 5036 scope.go:117] "RemoveContainer" containerID="79c3e8969c9f9b65d130f2990e182a31d8a891e52d1cfc216eca6a03eec32628" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.046941 5036 scope.go:117] "RemoveContainer" containerID="a9b0fcb41924f931ce7ff7ff245f38c13cdd3e86e0a6f8203641370276edb5bc" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.099918 5036 scope.go:117] "RemoveContainer" containerID="9a4748ce01c603963f1cf607735e45a8b5c0430ba32a933bbc66d44ecda48ff5" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.144625 5036 scope.go:117] "RemoveContainer" containerID="c008e0fa37fd0ebe2bd5f950ad76e574f7be09aea9ba2c7447cb36a55bec9f8c" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.179043 5036 scope.go:117] "RemoveContainer" containerID="51e7cb40a9e19b63c2f4d863ea9b6db94bb0c644585189fdf3fb938b9224b235" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.217801 5036 scope.go:117] "RemoveContainer" containerID="0c48c8c0e73c0ae93f5400b3f232a9ca88a7f2457336f8f5922b0151b1bc32fc" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.268437 5036 scope.go:117] "RemoveContainer" containerID="e5444e16dd10b769b23e1520919e4d90fbef589ff09bdcffc5a4fd3a393bf19c" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.289018 5036 scope.go:117] "RemoveContainer" containerID="9d1d6f44d352d617ec142420b42426e14d335a6452b817af20cca30f4c6255b4" Jan 10 16:54:21 crc 
kubenswrapper[5036]: I0110 16:54:21.331014 5036 scope.go:117] "RemoveContainer" containerID="8d47ab53ff73ece2438b5a690dec63225c2d89b69b409be461593a11cd0e4a87" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.345642 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.409124 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory\") pod \"45942a81-832d-42f1-b68d-57e92315759c\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.409189 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam\") pod \"45942a81-832d-42f1-b68d-57e92315759c\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.409329 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjrvn\" (UniqueName: \"kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn\") pod \"45942a81-832d-42f1-b68d-57e92315759c\" (UID: \"45942a81-832d-42f1-b68d-57e92315759c\") " Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.415759 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn" (OuterVolumeSpecName: "kube-api-access-hjrvn") pod "45942a81-832d-42f1-b68d-57e92315759c" (UID: "45942a81-832d-42f1-b68d-57e92315759c"). InnerVolumeSpecName "kube-api-access-hjrvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.433835 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "45942a81-832d-42f1-b68d-57e92315759c" (UID: "45942a81-832d-42f1-b68d-57e92315759c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.437381 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory" (OuterVolumeSpecName: "inventory") pod "45942a81-832d-42f1-b68d-57e92315759c" (UID: "45942a81-832d-42f1-b68d-57e92315759c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.510942 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.510980 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/45942a81-832d-42f1-b68d-57e92315759c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:21 crc kubenswrapper[5036]: I0110 16:54:21.510992 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjrvn\" (UniqueName: \"kubernetes.io/projected/45942a81-832d-42f1-b68d-57e92315759c-kube-api-access-hjrvn\") on node \"crc\" DevicePath \"\"" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.038653 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" event={"ID":"45942a81-832d-42f1-b68d-57e92315759c","Type":"ContainerDied","Data":"fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd"} Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.038778 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1b1b7a753fc7e27cf3bab251a7f557531ced5179bef577afbced5ab2fc53cd" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.038711 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.112748 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf"] Jan 10 16:54:22 crc kubenswrapper[5036]: E0110 16:54:22.113217 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45942a81-832d-42f1-b68d-57e92315759c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.113240 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="45942a81-832d-42f1-b68d-57e92315759c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.113434 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="45942a81-832d-42f1-b68d-57e92315759c" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.114137 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.121510 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf"] Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.122416 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.122474 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.122548 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhk6\" (UniqueName: \"kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.144184 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.144184 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.144370 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.144400 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.223799 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.223882 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.223961 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhk6\" (UniqueName: 
\"kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.231405 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.235193 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.240130 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhk6\" (UniqueName: \"kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.464201 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:54:22 crc kubenswrapper[5036]: I0110 16:54:22.987420 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf"] Jan 10 16:54:23 crc kubenswrapper[5036]: I0110 16:54:23.049120 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" event={"ID":"1bdfee06-51dd-434d-b315-99e079ec9895","Type":"ContainerStarted","Data":"99533e68f50e15e3dee288d27ff6ec2285dccf24cefbb5e1c27f79a268d6508c"} Jan 10 16:54:23 crc kubenswrapper[5036]: I0110 16:54:23.055445 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-c9rbf"] Jan 10 16:54:23 crc kubenswrapper[5036]: I0110 16:54:23.062713 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-c9rbf"] Jan 10 16:54:23 crc kubenswrapper[5036]: I0110 16:54:23.070394 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wwspw"] Jan 10 16:54:23 crc kubenswrapper[5036]: I0110 16:54:23.077274 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wwspw"] Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.039804 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bks65"] Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.049004 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cngfk"] Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.067858 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cngfk"] Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.071898 5036 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" event={"ID":"1bdfee06-51dd-434d-b315-99e079ec9895","Type":"ContainerStarted","Data":"955496c32694a10b2d348bd4a8f1c73c530e4f25b291066855c51fdccd8fc08f"} Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.075286 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bks65"] Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.088302 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" podStartSLOduration=1.6826396689999998 podStartE2EDuration="2.08828394s" podCreationTimestamp="2026-01-10 16:54:22 +0000 UTC" firstStartedPulling="2026-01-10 16:54:22.990095428 +0000 UTC m=+1584.860330922" lastFinishedPulling="2026-01-10 16:54:23.395739699 +0000 UTC m=+1585.265975193" observedRunningTime="2026-01-10 16:54:24.082361211 +0000 UTC m=+1585.952596715" watchObservedRunningTime="2026-01-10 16:54:24.08828394 +0000 UTC m=+1585.958519434" Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.520456 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c" path="/var/lib/kubelet/pods/0fe0ff1c-0f37-4d2a-a66e-5fd5412c676c/volumes" Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.521435 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402" path="/var/lib/kubelet/pods/2cec5a6e-1a0d-45e9-a4f3-5e8aedc3d402/volumes" Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.522362 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4df096-e6ee-47df-a4ae-035aeade27a6" path="/var/lib/kubelet/pods/7b4df096-e6ee-47df-a4ae-035aeade27a6/volumes" Jan 10 16:54:24 crc kubenswrapper[5036]: I0110 16:54:24.523807 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1899d96-c3b2-415c-b1fd-7c2847da4370" path="/var/lib/kubelet/pods/b1899d96-c3b2-415c-b1fd-7c2847da4370/volumes" Jan 10 16:54:45 crc kubenswrapper[5036]: I0110 16:54:45.039899 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-j9crs"] Jan 10 16:54:45 crc kubenswrapper[5036]: I0110 16:54:45.048151 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-j9crs"] Jan 10 16:54:46 crc kubenswrapper[5036]: I0110 16:54:46.523179 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe6dd46-603d-4595-ad27-32f98623fbcc" path="/var/lib/kubelet/pods/6fe6dd46-603d-4595-ad27-32f98623fbcc/volumes" Jan 10 16:54:55 crc kubenswrapper[5036]: I0110 16:54:55.904646 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:54:55 crc kubenswrapper[5036]: I0110 16:54:55.905286 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.029361 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-d24s2"] Jan 10 16:55:14 crc 
kubenswrapper[5036]: I0110 16:55:14.058259 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d6g85"] Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.067889 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b3eb-account-create-update-bln2k"] Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.076465 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-d24s2"] Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.084909 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d6g85"] Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.091795 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b3eb-account-create-update-bln2k"] Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.519991 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4589fdc9-748c-41e7-ba5b-493750149d60" path="/var/lib/kubelet/pods/4589fdc9-748c-41e7-ba5b-493750149d60/volumes" Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.520968 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644223c9-410d-4ce9-b1a7-d6137d46f3cf" path="/var/lib/kubelet/pods/644223c9-410d-4ce9-b1a7-d6137d46f3cf/volumes" Jan 10 16:55:14 crc kubenswrapper[5036]: I0110 16:55:14.521970 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52d4419-3cc2-47fb-8a1f-5b086a2660a9" path="/var/lib/kubelet/pods/f52d4419-3cc2-47fb-8a1f-5b086a2660a9/volumes" Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.036865 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6758-account-create-update-mbg5f"] Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.045569 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1e9d-account-create-update-xf4tj"] Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.053058 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6758-account-create-update-mbg5f"] Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.062009 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1e9d-account-create-update-xf4tj"] Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.069827 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-sz6zr"] Jan 10 16:55:15 crc kubenswrapper[5036]: I0110 16:55:15.076769 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-sz6zr"] Jan 10 16:55:16 crc kubenswrapper[5036]: I0110 16:55:16.530371 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5135e7f-1fec-4960-ab32-eeb7901e1a4d" path="/var/lib/kubelet/pods/d5135e7f-1fec-4960-ab32-eeb7901e1a4d/volumes" Jan 10 16:55:16 crc kubenswrapper[5036]: I0110 16:55:16.532306 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c" path="/var/lib/kubelet/pods/e30386ec-f3c1-4e4e-a7d7-e1f1d44b8f8c/volumes" Jan 10 16:55:16 crc kubenswrapper[5036]: I0110 16:55:16.533365 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56" path="/var/lib/kubelet/pods/e3f80539-7ec4-4cb0-ae3e-ecf8fd96ab56/volumes" Jan 10 16:55:17 crc kubenswrapper[5036]: I0110 16:55:17.540453 5036 generic.go:334] "Generic (PLEG): container finished" podID="1bdfee06-51dd-434d-b315-99e079ec9895" 
containerID="955496c32694a10b2d348bd4a8f1c73c530e4f25b291066855c51fdccd8fc08f" exitCode=0 Jan 10 16:55:17 crc kubenswrapper[5036]: I0110 16:55:17.540539 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" event={"ID":"1bdfee06-51dd-434d-b315-99e079ec9895","Type":"ContainerDied","Data":"955496c32694a10b2d348bd4a8f1c73c530e4f25b291066855c51fdccd8fc08f"} Jan 10 16:55:18 crc kubenswrapper[5036]: I0110 16:55:18.975718 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.077194 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory\") pod \"1bdfee06-51dd-434d-b315-99e079ec9895\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.077780 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") pod \"1bdfee06-51dd-434d-b315-99e079ec9895\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.077895 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnhk6\" (UniqueName: \"kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6\") pod \"1bdfee06-51dd-434d-b315-99e079ec9895\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.082997 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6" (OuterVolumeSpecName: "kube-api-access-rnhk6") pod "1bdfee06-51dd-434d-b315-99e079ec9895" (UID: "1bdfee06-51dd-434d-b315-99e079ec9895"). InnerVolumeSpecName "kube-api-access-rnhk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:55:19 crc kubenswrapper[5036]: E0110 16:55:19.098574 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam podName:1bdfee06-51dd-434d-b315-99e079ec9895 nodeName:}" failed. No retries permitted until 2026-01-10 16:55:19.598549623 +0000 UTC m=+1641.468785117 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam") pod "1bdfee06-51dd-434d-b315-99e079ec9895" (UID: "1bdfee06-51dd-434d-b315-99e079ec9895") : error deleting /var/lib/kubelet/pods/1bdfee06-51dd-434d-b315-99e079ec9895/volume-subpaths: remove /var/lib/kubelet/pods/1bdfee06-51dd-434d-b315-99e079ec9895/volume-subpaths: no such file or directory Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.100714 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory" (OuterVolumeSpecName: "inventory") pod "1bdfee06-51dd-434d-b315-99e079ec9895" (UID: "1bdfee06-51dd-434d-b315-99e079ec9895"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.179322 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnhk6\" (UniqueName: \"kubernetes.io/projected/1bdfee06-51dd-434d-b315-99e079ec9895-kube-api-access-rnhk6\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.179352 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.565502 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" event={"ID":"1bdfee06-51dd-434d-b315-99e079ec9895","Type":"ContainerDied","Data":"99533e68f50e15e3dee288d27ff6ec2285dccf24cefbb5e1c27f79a268d6508c"} Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.565559 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99533e68f50e15e3dee288d27ff6ec2285dccf24cefbb5e1c27f79a268d6508c" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.565642 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.690520 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") pod \"1bdfee06-51dd-434d-b315-99e079ec9895\" (UID: \"1bdfee06-51dd-434d-b315-99e079ec9895\") " Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.712843 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1bdfee06-51dd-434d-b315-99e079ec9895" (UID: "1bdfee06-51dd-434d-b315-99e079ec9895"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.717379 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxjg4"] Jan 10 16:55:19 crc kubenswrapper[5036]: E0110 16:55:19.717892 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdfee06-51dd-434d-b315-99e079ec9895" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.717915 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdfee06-51dd-434d-b315-99e079ec9895" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.718120 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdfee06-51dd-434d-b315-99e079ec9895" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.718808 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.748379 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxjg4"] Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.795724 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.795782 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.795842 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxf65\" (UniqueName: \"kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.795925 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1bdfee06-51dd-434d-b315-99e079ec9895-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.897177 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxf65\" (UniqueName: \"kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.897312 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.897349 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.902319 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.903528 
5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:19 crc kubenswrapper[5036]: I0110 16:55:19.912341 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxf65\" (UniqueName: \"kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65\") pod \"ssh-known-hosts-edpm-deployment-cxjg4\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:20 crc kubenswrapper[5036]: I0110 16:55:20.080960 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:20 crc kubenswrapper[5036]: I0110 16:55:20.602865 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxjg4"] Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.506269 5036 scope.go:117] "RemoveContainer" containerID="c3accb81973c6059402e04d949e82809211eaa8ac93cdf0a430d82051a1859c4" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.529613 5036 scope.go:117] "RemoveContainer" containerID="df635eed0a19abd68f98e6e2dd3b7d65c16d69333e7adc57a6235c75b259d1be" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.582281 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" event={"ID":"1801856c-1172-415e-b44c-84437ef2c240","Type":"ContainerStarted","Data":"67f3b7fd24ff2a03470643cb4651ff7f9c5e5867383284789883ee79833bcec7"} Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.582320 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" event={"ID":"1801856c-1172-415e-b44c-84437ef2c240","Type":"ContainerStarted","Data":"2324e98578d42a675aa97917bda5b007a7bd777745a26b8a673a293d1b7cd722"} Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.600547 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" podStartSLOduration=2.070682734 podStartE2EDuration="2.600524016s" podCreationTimestamp="2026-01-10 16:55:19 +0000 UTC" firstStartedPulling="2026-01-10 16:55:20.607119201 +0000 UTC m=+1642.477354705" lastFinishedPulling="2026-01-10 16:55:21.136960473 +0000 UTC m=+1643.007195987" observedRunningTime="2026-01-10 16:55:21.599770524 +0000 UTC m=+1643.470006028" watchObservedRunningTime="2026-01-10 16:55:21.600524016 +0000 UTC m=+1643.470759520" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.630202 5036 scope.go:117] "RemoveContainer" containerID="aaa9e8a5cd5832c9ece96199964f96f5f7f636a5e4e2aeac5041907f8d4864c1" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.665346 5036 scope.go:117] "RemoveContainer" containerID="602aabefb458eef6fc29fddff81aba1cabbfa80cdd04d75fa7163fbef6f386be" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.681884 5036 scope.go:117] "RemoveContainer" containerID="ad1572e307019edae219440a682d075861e0bb46a1d5f9b75a8dd28efff7b578" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.729954 5036 scope.go:117] "RemoveContainer" containerID="1ec53884f39b464cdcdebf5e1b855b078e887cebdd284c2afa448443a542fa99" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 
16:55:21.755603 5036 scope.go:117] "RemoveContainer" containerID="1b6fe53b54157c6721584141cb346327060b09cff0ddbcf42cf33cb176320ec8" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.786376 5036 scope.go:117] "RemoveContainer" containerID="5aa7d4af41e3913267cb29417492d0762ad6c746ab2547a867c7a0c593e6b8a1" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.829109 5036 scope.go:117] "RemoveContainer" containerID="ee1ff43f87aafe6e6053d10753108168c66f458552bafa514e0c8a418b77ed78" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.856197 5036 scope.go:117] "RemoveContainer" containerID="c32f8ba1b80961763f2faa4b29618b0f7d2c21750aca2adb5f383af25c067818" Jan 10 16:55:21 crc kubenswrapper[5036]: I0110 16:55:21.881703 5036 scope.go:117] "RemoveContainer" containerID="cbfd17d699134a608165d5a7c768f8f1ef9c076a375f09a1733cc2038619dbbb" Jan 10 16:55:25 crc kubenswrapper[5036]: I0110 16:55:25.904486 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:55:25 crc kubenswrapper[5036]: I0110 16:55:25.904837 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:55:28 crc kubenswrapper[5036]: I0110 16:55:28.641706 5036 generic.go:334] "Generic (PLEG): container finished" podID="1801856c-1172-415e-b44c-84437ef2c240" containerID="67f3b7fd24ff2a03470643cb4651ff7f9c5e5867383284789883ee79833bcec7" exitCode=0 Jan 10 16:55:28 crc kubenswrapper[5036]: I0110 16:55:28.641764 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" event={"ID":"1801856c-1172-415e-b44c-84437ef2c240","Type":"ContainerDied","Data":"67f3b7fd24ff2a03470643cb4651ff7f9c5e5867383284789883ee79833bcec7"} Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.422919 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.499721 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0\") pod \"1801856c-1172-415e-b44c-84437ef2c240\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.500030 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam\") pod \"1801856c-1172-415e-b44c-84437ef2c240\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.500170 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxf65\" (UniqueName: \"kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65\") pod \"1801856c-1172-415e-b44c-84437ef2c240\" (UID: \"1801856c-1172-415e-b44c-84437ef2c240\") " Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.507626 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65" (OuterVolumeSpecName: "kube-api-access-lxf65") pod "1801856c-1172-415e-b44c-84437ef2c240" (UID: "1801856c-1172-415e-b44c-84437ef2c240"). InnerVolumeSpecName "kube-api-access-lxf65". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.539815 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1801856c-1172-415e-b44c-84437ef2c240" (UID: "1801856c-1172-415e-b44c-84437ef2c240"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.541990 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1801856c-1172-415e-b44c-84437ef2c240" (UID: "1801856c-1172-415e-b44c-84437ef2c240"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.602790 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.602828 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxf65\" (UniqueName: \"kubernetes.io/projected/1801856c-1172-415e-b44c-84437ef2c240-kube-api-access-lxf65\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.602839 5036 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1801856c-1172-415e-b44c-84437ef2c240-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.668119 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" event={"ID":"1801856c-1172-415e-b44c-84437ef2c240","Type":"ContainerDied","Data":"2324e98578d42a675aa97917bda5b007a7bd777745a26b8a673a293d1b7cd722"} Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.668479 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2324e98578d42a675aa97917bda5b007a7bd777745a26b8a673a293d1b7cd722" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.668619 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cxjg4" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.738997 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl"] Jan 10 16:55:30 crc kubenswrapper[5036]: E0110 16:55:30.739416 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1801856c-1172-415e-b44c-84437ef2c240" containerName="ssh-known-hosts-edpm-deployment" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.739435 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1801856c-1172-415e-b44c-84437ef2c240" containerName="ssh-known-hosts-edpm-deployment" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.739615 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1801856c-1172-415e-b44c-84437ef2c240" containerName="ssh-known-hosts-edpm-deployment" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.740190 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.743886 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.744040 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.745356 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.747097 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.751708 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl"] Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.908142 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.908198 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxh48\" (UniqueName: \"kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:30 crc kubenswrapper[5036]: I0110 16:55:30.908267 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.009672 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxh48\" (UniqueName: \"kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.009790 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.009928 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.013939 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.014197 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.032709 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxh48\" (UniqueName: \"kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-vlgcl\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.059053 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.637043 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl"] Jan 10 16:55:31 crc kubenswrapper[5036]: I0110 16:55:31.678932 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" event={"ID":"27f5540a-27a6-4594-b86c-4118fff1f8da","Type":"ContainerStarted","Data":"16d31aaaa7b6405006195e64c0e4bd130709f044abc62bd2f76ff47a02eacfaf"} Jan 10 16:55:32 crc kubenswrapper[5036]: I0110 16:55:32.688570 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" event={"ID":"27f5540a-27a6-4594-b86c-4118fff1f8da","Type":"ContainerStarted","Data":"e78f8734699e9ba36030a91cb0b112add8a3c8ac6246e0f6dad2df7cea8ce553"} Jan 10 16:55:32 crc kubenswrapper[5036]: I0110 16:55:32.707991 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" podStartSLOduration=2.265976966 podStartE2EDuration="2.707970454s" podCreationTimestamp="2026-01-10 16:55:30 +0000 UTC" firstStartedPulling="2026-01-10 16:55:31.643263807 +0000 UTC m=+1653.513499301" lastFinishedPulling="2026-01-10 16:55:32.085257285 +0000 UTC m=+1653.955492789" observedRunningTime="2026-01-10 16:55:32.707846711 +0000 UTC m=+1654.578082225" watchObservedRunningTime="2026-01-10 16:55:32.707970454 +0000 UTC m=+1654.578205968" Jan 10 16:55:40 crc kubenswrapper[5036]: I0110 16:55:40.039770 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pqkrg"] Jan 10 16:55:40 crc kubenswrapper[5036]: I0110 16:55:40.049068 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pqkrg"] Jan 10 16:55:40 crc kubenswrapper[5036]: I0110 
16:55:40.518605 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff51de8-7e61-4799-b5fc-24e294ec8050" path="/var/lib/kubelet/pods/3ff51de8-7e61-4799-b5fc-24e294ec8050/volumes" Jan 10 16:55:41 crc kubenswrapper[5036]: I0110 16:55:41.761660 5036 generic.go:334] "Generic (PLEG): container finished" podID="27f5540a-27a6-4594-b86c-4118fff1f8da" containerID="e78f8734699e9ba36030a91cb0b112add8a3c8ac6246e0f6dad2df7cea8ce553" exitCode=0 Jan 10 16:55:41 crc kubenswrapper[5036]: I0110 16:55:41.761840 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" event={"ID":"27f5540a-27a6-4594-b86c-4118fff1f8da","Type":"ContainerDied","Data":"e78f8734699e9ba36030a91cb0b112add8a3c8ac6246e0f6dad2df7cea8ce553"} Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.200925 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.328143 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxh48\" (UniqueName: \"kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48\") pod \"27f5540a-27a6-4594-b86c-4118fff1f8da\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.328354 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam\") pod \"27f5540a-27a6-4594-b86c-4118fff1f8da\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.328390 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory\") pod \"27f5540a-27a6-4594-b86c-4118fff1f8da\" (UID: \"27f5540a-27a6-4594-b86c-4118fff1f8da\") " Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.338837 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48" (OuterVolumeSpecName: "kube-api-access-rxh48") pod "27f5540a-27a6-4594-b86c-4118fff1f8da" (UID: "27f5540a-27a6-4594-b86c-4118fff1f8da"). InnerVolumeSpecName "kube-api-access-rxh48". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.358808 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory" (OuterVolumeSpecName: "inventory") pod "27f5540a-27a6-4594-b86c-4118fff1f8da" (UID: "27f5540a-27a6-4594-b86c-4118fff1f8da"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.364791 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "27f5540a-27a6-4594-b86c-4118fff1f8da" (UID: "27f5540a-27a6-4594-b86c-4118fff1f8da"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.431164 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxh48\" (UniqueName: \"kubernetes.io/projected/27f5540a-27a6-4594-b86c-4118fff1f8da-kube-api-access-rxh48\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.431206 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.431217 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f5540a-27a6-4594-b86c-4118fff1f8da-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.781777 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" event={"ID":"27f5540a-27a6-4594-b86c-4118fff1f8da","Type":"ContainerDied","Data":"16d31aaaa7b6405006195e64c0e4bd130709f044abc62bd2f76ff47a02eacfaf"} Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.781829 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16d31aaaa7b6405006195e64c0e4bd130709f044abc62bd2f76ff47a02eacfaf" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.781857 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.859675 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb"] Jan 10 16:55:43 crc kubenswrapper[5036]: E0110 16:55:43.860093 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f5540a-27a6-4594-b86c-4118fff1f8da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.860111 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f5540a-27a6-4594-b86c-4118fff1f8da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.860299 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f5540a-27a6-4594-b86c-4118fff1f8da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.860978 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.863545 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.863807 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.864800 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.864922 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.871521 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb"] Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.941162 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.941208 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7f2w\" (UniqueName: \"kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:43 crc kubenswrapper[5036]: I0110 16:55:43.941290 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.043785 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.043869 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7f2w\" (UniqueName: \"kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.044140 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.049906 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.050261 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.061938 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7f2w\" (UniqueName: \"kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.182556 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.675429 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb"] Jan 10 16:55:44 crc kubenswrapper[5036]: W0110 16:55:44.679689 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74251251_10f9_409c_968c_c955f24235b2.slice/crio-584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151 WatchSource:0}: Error finding container 584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151: Status 404 returned error can't find the container with id 584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151 Jan 10 16:55:44 crc kubenswrapper[5036]: I0110 16:55:44.790547 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" event={"ID":"74251251-10f9-409c-968c-c955f24235b2","Type":"ContainerStarted","Data":"584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151"} Jan 10 16:55:45 crc kubenswrapper[5036]: I0110 16:55:45.801013 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" event={"ID":"74251251-10f9-409c-968c-c955f24235b2","Type":"ContainerStarted","Data":"4fa98bca3cb058e28dc9fe21a78adab57b00c56666815e3d0fd83a5125692672"} Jan 10 16:55:45 crc kubenswrapper[5036]: I0110 16:55:45.825002 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" podStartSLOduration=2.3244620449999998 podStartE2EDuration="2.824982981s" podCreationTimestamp="2026-01-10 16:55:43 +0000 UTC" firstStartedPulling="2026-01-10 16:55:44.683224108 +0000 UTC m=+1666.553459602" lastFinishedPulling="2026-01-10 16:55:45.183745044 
+0000 UTC m=+1667.053980538" observedRunningTime="2026-01-10 16:55:45.821135391 +0000 UTC m=+1667.691370885" watchObservedRunningTime="2026-01-10 16:55:45.824982981 +0000 UTC m=+1667.695218475" Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.885313 5036 generic.go:334] "Generic (PLEG): container finished" podID="74251251-10f9-409c-968c-c955f24235b2" containerID="4fa98bca3cb058e28dc9fe21a78adab57b00c56666815e3d0fd83a5125692672" exitCode=0 Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.885391 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" event={"ID":"74251251-10f9-409c-968c-c955f24235b2","Type":"ContainerDied","Data":"4fa98bca3cb058e28dc9fe21a78adab57b00c56666815e3d0fd83a5125692672"} Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.905168 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.905522 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.905740 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.906772 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 16:55:55 crc kubenswrapper[5036]: I0110 16:55:55.907015 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" gracePeriod=600 Jan 10 16:55:56 crc kubenswrapper[5036]: E0110 16:55:56.042860 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:55:56 crc kubenswrapper[5036]: I0110 16:55:56.897097 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" exitCode=0 Jan 10 16:55:56 crc kubenswrapper[5036]: I0110 16:55:56.897248 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219"} Jan 10 16:55:56 crc kubenswrapper[5036]: I0110 16:55:56.897281 5036 scope.go:117] "RemoveContainer" containerID="1b3cfa0819aeac4502d95e4d9f7b2ee845bbdb656e6de3b7c4e292249ece1785" Jan 10 16:55:56 crc kubenswrapper[5036]: I0110 16:55:56.897775 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:55:56 crc kubenswrapper[5036]: E0110 16:55:56.897995 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.751919 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.890047 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") pod \"74251251-10f9-409c-968c-c955f24235b2\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.890210 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7f2w\" (UniqueName: \"kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w\") pod \"74251251-10f9-409c-968c-c955f24235b2\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.890358 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory\") pod \"74251251-10f9-409c-968c-c955f24235b2\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.902047 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w" (OuterVolumeSpecName: "kube-api-access-x7f2w") pod "74251251-10f9-409c-968c-c955f24235b2" (UID: "74251251-10f9-409c-968c-c955f24235b2"). InnerVolumeSpecName "kube-api-access-x7f2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.911449 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" event={"ID":"74251251-10f9-409c-968c-c955f24235b2","Type":"ContainerDied","Data":"584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151"} Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.911486 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="584675ed44983c54fd1d999caea430bcfb3874de841d82cf92e27e8aeccfe151" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.911533 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.930270 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory" (OuterVolumeSpecName: "inventory") pod "74251251-10f9-409c-968c-c955f24235b2" (UID: "74251251-10f9-409c-968c-c955f24235b2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:57 crc kubenswrapper[5036]: E0110 16:55:57.934181 5036 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam podName:74251251-10f9-409c-968c-c955f24235b2 nodeName:}" failed. No retries permitted until 2026-01-10 16:55:58.434151591 +0000 UTC m=+1680.304387185 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam") pod "74251251-10f9-409c-968c-c955f24235b2" (UID: "74251251-10f9-409c-968c-c955f24235b2") : error deleting /var/lib/kubelet/pods/74251251-10f9-409c-968c-c955f24235b2/volume-subpaths: remove /var/lib/kubelet/pods/74251251-10f9-409c-968c-c955f24235b2/volume-subpaths: no such file or directory Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.993083 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7f2w\" (UniqueName: \"kubernetes.io/projected/74251251-10f9-409c-968c-c955f24235b2-kube-api-access-x7f2w\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:57 crc kubenswrapper[5036]: I0110 16:55:57.993120 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 16:55:58 crc kubenswrapper[5036]: I0110 16:55:58.503542 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") pod \"74251251-10f9-409c-968c-c955f24235b2\" (UID: \"74251251-10f9-409c-968c-c955f24235b2\") " Jan 10 16:55:58 crc kubenswrapper[5036]: I0110 16:55:58.511868 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "74251251-10f9-409c-968c-c955f24235b2" (UID: "74251251-10f9-409c-968c-c955f24235b2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 16:55:58 crc kubenswrapper[5036]: I0110 16:55:58.606090 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74251251-10f9-409c-968c-c955f24235b2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 16:56:02 crc kubenswrapper[5036]: I0110 16:56:02.043499 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzdzf"] Jan 10 16:56:02 crc kubenswrapper[5036]: I0110 16:56:02.050570 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lzdzf"] Jan 10 16:56:02 crc kubenswrapper[5036]: I0110 16:56:02.526792 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4596a8b1-1c76-48fd-8c48-ae9adb6f629e" path="/var/lib/kubelet/pods/4596a8b1-1c76-48fd-8c48-ae9adb6f629e/volumes" Jan 10 16:56:05 crc kubenswrapper[5036]: I0110 16:56:05.027339 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-282lj"] Jan 10 16:56:05 crc kubenswrapper[5036]: I0110 16:56:05.033842 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-282lj"] Jan 10 16:56:06 crc kubenswrapper[5036]: I0110 16:56:06.522060 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1109f9-6187-4b88-bb21-c43f2b25b4ad" path="/var/lib/kubelet/pods/7d1109f9-6187-4b88-bb21-c43f2b25b4ad/volumes" Jan 10 16:56:08 crc kubenswrapper[5036]: I0110 16:56:08.518305 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:56:08 crc kubenswrapper[5036]: E0110 16:56:08.518743 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:56:20 crc kubenswrapper[5036]: I0110 16:56:20.508586 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:56:20 crc kubenswrapper[5036]: E0110 16:56:20.509623 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:56:22 crc kubenswrapper[5036]: I0110 16:56:22.091199 5036 scope.go:117] "RemoveContainer" containerID="5083d291349a6cf55709db754c43d52d0f05538b981c428264263991423167a1" Jan 10 16:56:22 crc kubenswrapper[5036]: I0110 16:56:22.157830 5036 scope.go:117] "RemoveContainer" containerID="fa1da74453138d18273be89397bb33897347520e37fcd326e2f85eaf96f7237c" Jan 10 16:56:22 crc kubenswrapper[5036]: I0110 16:56:22.202425 5036 scope.go:117] "RemoveContainer" containerID="a46af9221f40c5ca7a6f8ac24fb548026064eee9253e5e0c792c3663486e9aa2" Jan 10 16:56:35 crc kubenswrapper[5036]: I0110 16:56:35.508795 5036 scope.go:117] "RemoveContainer" 
containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:56:35 crc kubenswrapper[5036]: E0110 16:56:35.509671 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:56:47 crc kubenswrapper[5036]: I0110 16:56:47.047113 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ttt6f"] Jan 10 16:56:47 crc kubenswrapper[5036]: I0110 16:56:47.060762 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ttt6f"] Jan 10 16:56:48 crc kubenswrapper[5036]: I0110 16:56:48.518896 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:56:48 crc kubenswrapper[5036]: E0110 16:56:48.519803 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:56:48 crc kubenswrapper[5036]: I0110 16:56:48.528133 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ba54f7-a8b0-4836-b983-63702bb4c94d" path="/var/lib/kubelet/pods/62ba54f7-a8b0-4836-b983-63702bb4c94d/volumes" Jan 10 16:56:59 crc kubenswrapper[5036]: I0110 16:56:59.508456 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:56:59 crc kubenswrapper[5036]: E0110 16:56:59.509354 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:57:11 crc kubenswrapper[5036]: I0110 16:57:11.507627 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:57:11 crc kubenswrapper[5036]: E0110 16:57:11.508443 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:57:22 crc kubenswrapper[5036]: I0110 16:57:22.339849 5036 scope.go:117] "RemoveContainer" containerID="8691cb49074c75a92edd84be9cdc3691890a2b9ce024486d692d4a9988501753" Jan 10 16:57:22 crc kubenswrapper[5036]: I0110 16:57:22.509087 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:57:22 crc 
kubenswrapper[5036]: E0110 16:57:22.510068 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:57:34 crc kubenswrapper[5036]: I0110 16:57:34.508328 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:57:34 crc kubenswrapper[5036]: E0110 16:57:34.509112 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:57:46 crc kubenswrapper[5036]: I0110 16:57:46.508433 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:57:46 crc kubenswrapper[5036]: E0110 16:57:46.509919 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:58:00 crc kubenswrapper[5036]: I0110 16:58:00.511289 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:58:00 crc kubenswrapper[5036]: E0110 16:58:00.512050 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:58:14 crc kubenswrapper[5036]: I0110 16:58:14.507836 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:58:14 crc kubenswrapper[5036]: E0110 16:58:14.508860 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:58:27 crc kubenswrapper[5036]: I0110 16:58:27.508613 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:58:27 crc kubenswrapper[5036]: E0110 16:58:27.509428 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:58:42 crc kubenswrapper[5036]: I0110 16:58:42.508202 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:58:42 crc kubenswrapper[5036]: E0110 16:58:42.509114 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:58:55 crc kubenswrapper[5036]: I0110 16:58:55.508374 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:58:55 crc kubenswrapper[5036]: E0110 16:58:55.509294 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:09 crc kubenswrapper[5036]: I0110 16:59:09.508438 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:59:09 crc kubenswrapper[5036]: E0110 16:59:09.509383 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:22 crc kubenswrapper[5036]: I0110 16:59:22.508130 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:59:22 crc kubenswrapper[5036]: E0110 16:59:22.509073 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:36 crc kubenswrapper[5036]: I0110 16:59:36.508215 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:59:36 crc kubenswrapper[5036]: E0110 16:59:36.509257 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.058548 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.064332 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.070542 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.076858 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.083407 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.089576 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.095284 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.103306 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.110629 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxjg4"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.116661 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.122145 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-5799l"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.127761 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-vlgcl"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.133479 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kqvpg"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.138947 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cxjg4"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.144009 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cc8lz"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.148957 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-ck7bb"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.153836 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pp5zf"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.159055 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-c2tl2"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.164624 
5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7mbj6"] Jan 10 16:59:39 crc kubenswrapper[5036]: I0110 16:59:39.170609 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hf724"] Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.526914 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddc428b-2df4-4b8e-935e-cd07abb35a50" path="/var/lib/kubelet/pods/0ddc428b-2df4-4b8e-935e-cd07abb35a50/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.528492 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1801856c-1172-415e-b44c-84437ef2c240" path="/var/lib/kubelet/pods/1801856c-1172-415e-b44c-84437ef2c240/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.529351 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdfee06-51dd-434d-b315-99e079ec9895" path="/var/lib/kubelet/pods/1bdfee06-51dd-434d-b315-99e079ec9895/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.530137 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d19244c-8236-481b-9b50-b6a641c7b724" path="/var/lib/kubelet/pods/1d19244c-8236-481b-9b50-b6a641c7b724/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.531752 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f5540a-27a6-4594-b86c-4118fff1f8da" path="/var/lib/kubelet/pods/27f5540a-27a6-4594-b86c-4118fff1f8da/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.532522 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3978931b-044e-4415-9e0b-5f16130dccc6" path="/var/lib/kubelet/pods/3978931b-044e-4415-9e0b-5f16130dccc6/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.533332 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5a7464-a4c9-4aed-a25f-1d19266239b4" path="/var/lib/kubelet/pods/3c5a7464-a4c9-4aed-a25f-1d19266239b4/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.535948 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45942a81-832d-42f1-b68d-57e92315759c" path="/var/lib/kubelet/pods/45942a81-832d-42f1-b68d-57e92315759c/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.536884 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f2b92e-a982-4894-8f90-b500b35e1016" path="/var/lib/kubelet/pods/51f2b92e-a982-4894-8f90-b500b35e1016/volumes" Jan 10 16:59:40 crc kubenswrapper[5036]: I0110 16:59:40.538282 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74251251-10f9-409c-968c-c955f24235b2" path="/var/lib/kubelet/pods/74251251-10f9-409c-968c-c955f24235b2/volumes" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.902241 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx"] Jan 10 16:59:44 crc kubenswrapper[5036]: E0110 16:59:44.903527 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74251251-10f9-409c-968c-c955f24235b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.903544 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="74251251-10f9-409c-968c-c955f24235b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.903745 5036 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="74251251-10f9-409c-968c-c955f24235b2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.905540 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.907570 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.908253 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.909503 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.909543 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.921834 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx"] Jan 10 16:59:44 crc kubenswrapper[5036]: I0110 16:59:44.958021 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.009846 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.009921 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.009963 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.010104 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cf4w\" (UniqueName: \"kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.010141 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.113130 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.113213 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.113241 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.113312 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cf4w\" (UniqueName: \"kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.113338 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.120367 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.122284 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.122668 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.135313 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.145004 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cf4w\" (UniqueName: \"kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.270947 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.790088 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx"] Jan 10 16:59:45 crc kubenswrapper[5036]: W0110 16:59:45.798520 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e48f105_5183_4dd8_94d9_8a8636ca4c82.slice/crio-6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a WatchSource:0}: Error finding container 6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a: Status 404 returned error can't find the container with id 6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a Jan 10 16:59:45 crc kubenswrapper[5036]: I0110 16:59:45.801834 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 16:59:46 crc kubenswrapper[5036]: I0110 16:59:46.263974 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" event={"ID":"8e48f105-5183-4dd8-94d9-8a8636ca4c82","Type":"ContainerStarted","Data":"6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a"} Jan 10 16:59:47 crc kubenswrapper[5036]: I0110 16:59:47.282252 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" event={"ID":"8e48f105-5183-4dd8-94d9-8a8636ca4c82","Type":"ContainerStarted","Data":"18b41cc52597be4367dd8a1dd6b3859a1f407b39f9ed94be32eade4955be5555"} Jan 10 16:59:47 crc kubenswrapper[5036]: I0110 16:59:47.304374 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" podStartSLOduration=2.814412298 podStartE2EDuration="3.304347156s" podCreationTimestamp="2026-01-10 16:59:44 +0000 UTC" firstStartedPulling="2026-01-10 16:59:45.801621322 +0000 UTC m=+1907.671856816" lastFinishedPulling="2026-01-10 16:59:46.29155618 +0000 UTC m=+1908.161791674" observedRunningTime="2026-01-10 16:59:47.300443334 +0000 UTC m=+1909.170678848" watchObservedRunningTime="2026-01-10 16:59:47.304347156 +0000 UTC m=+1909.174582660" Jan 10 16:59:47 crc kubenswrapper[5036]: I0110 16:59:47.507604 5036 scope.go:117] "RemoveContainer" 
containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:59:47 crc kubenswrapper[5036]: E0110 16:59:47.507885 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.576192 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.578762 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.585416 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.693657 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.693782 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqx67\" (UniqueName: \"kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.693823 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.795496 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqx67\" (UniqueName: \"kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.795574 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.795727 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 
16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.796340 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.796372 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.821973 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqx67\" (UniqueName: \"kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67\") pod \"certified-operators-tk92d\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:49 crc kubenswrapper[5036]: I0110 16:59:49.897966 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:50 crc kubenswrapper[5036]: I0110 16:59:50.356625 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 16:59:50 crc kubenswrapper[5036]: W0110 16:59:50.360872 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52cd7a2_e44b_4e73_a390_3aafa94cb84f.slice/crio-7c3571755b2be84685498ca9433cf8b0a33dd290d7fad27f563b276298fb128e WatchSource:0}: Error finding container 7c3571755b2be84685498ca9433cf8b0a33dd290d7fad27f563b276298fb128e: Status 404 returned error can't find the container with id 7c3571755b2be84685498ca9433cf8b0a33dd290d7fad27f563b276298fb128e Jan 10 16:59:51 crc kubenswrapper[5036]: I0110 16:59:51.327539 5036 generic.go:334] "Generic (PLEG): container finished" podID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerID="41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b" exitCode=0 Jan 10 16:59:51 crc kubenswrapper[5036]: I0110 16:59:51.327587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerDied","Data":"41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b"} Jan 10 16:59:51 crc kubenswrapper[5036]: I0110 16:59:51.327616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerStarted","Data":"7c3571755b2be84685498ca9433cf8b0a33dd290d7fad27f563b276298fb128e"} Jan 10 16:59:52 crc kubenswrapper[5036]: I0110 16:59:52.346260 5036 generic.go:334] "Generic (PLEG): container finished" podID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerID="3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3" exitCode=0 Jan 10 16:59:52 crc kubenswrapper[5036]: I0110 16:59:52.346399 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerDied","Data":"3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3"} Jan 10 
16:59:53 crc kubenswrapper[5036]: I0110 16:59:53.374032 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerStarted","Data":"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908"} Jan 10 16:59:53 crc kubenswrapper[5036]: I0110 16:59:53.410475 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tk92d" podStartSLOduration=2.8764912799999998 podStartE2EDuration="4.410457968s" podCreationTimestamp="2026-01-10 16:59:49 +0000 UTC" firstStartedPulling="2026-01-10 16:59:51.329285804 +0000 UTC m=+1913.199521308" lastFinishedPulling="2026-01-10 16:59:52.863252492 +0000 UTC m=+1914.733487996" observedRunningTime="2026-01-10 16:59:53.401725658 +0000 UTC m=+1915.271961152" watchObservedRunningTime="2026-01-10 16:59:53.410457968 +0000 UTC m=+1915.280693462" Jan 10 16:59:58 crc kubenswrapper[5036]: I0110 16:59:58.517291 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 16:59:58 crc kubenswrapper[5036]: E0110 16:59:58.518287 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 16:59:59 crc kubenswrapper[5036]: I0110 16:59:59.442948 5036 generic.go:334] "Generic (PLEG): container finished" podID="8e48f105-5183-4dd8-94d9-8a8636ca4c82" containerID="18b41cc52597be4367dd8a1dd6b3859a1f407b39f9ed94be32eade4955be5555" exitCode=0 Jan 10 16:59:59 crc kubenswrapper[5036]: I0110 16:59:59.442997 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" event={"ID":"8e48f105-5183-4dd8-94d9-8a8636ca4c82","Type":"ContainerDied","Data":"18b41cc52597be4367dd8a1dd6b3859a1f407b39f9ed94be32eade4955be5555"} Jan 10 16:59:59 crc kubenswrapper[5036]: I0110 16:59:59.898139 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:59 crc kubenswrapper[5036]: I0110 16:59:59.898206 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 16:59:59 crc kubenswrapper[5036]: I0110 16:59:59.958818 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.168968 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg"] Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.170723 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.176069 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.176342 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.184813 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg"] Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.305162 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv85\" (UniqueName: \"kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.305273 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.305295 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.407364 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.407505 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv85\" (UniqueName: \"kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.407653 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.408976 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume\") pod 
\"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.415047 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.428159 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv85\" (UniqueName: \"kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85\") pod \"collect-profiles-29467740-xnkqg\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.511692 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.521869 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.573845 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 17:00:00 crc kubenswrapper[5036]: I0110 17:00:00.948525 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:00.999837 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg"] Jan 10 17:00:01 crc kubenswrapper[5036]: W0110 17:00:01.008381 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cfb13d_e77a_4592_9668_34473b8379af.slice/crio-2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a WatchSource:0}: Error finding container 2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a: Status 404 returned error can't find the container with id 2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.019794 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph\") pod \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.019965 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam\") pod \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.020024 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle\") pod 
\"8e48f105-5183-4dd8-94d9-8a8636ca4c82\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.020098 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory\") pod \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.020135 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cf4w\" (UniqueName: \"kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w\") pod \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\" (UID: \"8e48f105-5183-4dd8-94d9-8a8636ca4c82\") " Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.026171 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w" (OuterVolumeSpecName: "kube-api-access-6cf4w") pod "8e48f105-5183-4dd8-94d9-8a8636ca4c82" (UID: "8e48f105-5183-4dd8-94d9-8a8636ca4c82"). InnerVolumeSpecName "kube-api-access-6cf4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.026292 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8e48f105-5183-4dd8-94d9-8a8636ca4c82" (UID: "8e48f105-5183-4dd8-94d9-8a8636ca4c82"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.026528 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph" (OuterVolumeSpecName: "ceph") pod "8e48f105-5183-4dd8-94d9-8a8636ca4c82" (UID: "8e48f105-5183-4dd8-94d9-8a8636ca4c82"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.048857 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8e48f105-5183-4dd8-94d9-8a8636ca4c82" (UID: "8e48f105-5183-4dd8-94d9-8a8636ca4c82"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.048893 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory" (OuterVolumeSpecName: "inventory") pod "8e48f105-5183-4dd8-94d9-8a8636ca4c82" (UID: "8e48f105-5183-4dd8-94d9-8a8636ca4c82"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.121718 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.121751 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.121764 5036 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.121776 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e48f105-5183-4dd8-94d9-8a8636ca4c82-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.121787 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cf4w\" (UniqueName: \"kubernetes.io/projected/8e48f105-5183-4dd8-94d9-8a8636ca4c82-kube-api-access-6cf4w\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.463886 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" event={"ID":"8e48f105-5183-4dd8-94d9-8a8636ca4c82","Type":"ContainerDied","Data":"6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a"} Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.463930 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.463941 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6950e480910607d3f46f60223e529dbb405de74dad0a4adc68767fb3a89d854a" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.465403 5036 generic.go:334] "Generic (PLEG): container finished" podID="26cfb13d-e77a-4592-9668-34473b8379af" containerID="b1e371b80c152ef1f38bffd8f62cdad7521cd35493e7b0b1ea94a87028205164" exitCode=0 Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.465448 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" event={"ID":"26cfb13d-e77a-4592-9668-34473b8379af","Type":"ContainerDied","Data":"b1e371b80c152ef1f38bffd8f62cdad7521cd35493e7b0b1ea94a87028205164"} Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.465521 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" event={"ID":"26cfb13d-e77a-4592-9668-34473b8379af","Type":"ContainerStarted","Data":"2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a"} Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.540975 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz"] Jan 10 17:00:01 crc kubenswrapper[5036]: E0110 17:00:01.541395 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e48f105-5183-4dd8-94d9-8a8636ca4c82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.541421 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e48f105-5183-4dd8-94d9-8a8636ca4c82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.541735 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e48f105-5183-4dd8-94d9-8a8636ca4c82" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.542626 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.545502 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.545773 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.545992 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.546431 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.548620 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.558052 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz"] Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.630183 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.630235 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.630399 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqf6p\" (UniqueName: \"kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.630486 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.630823 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.732514 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.732802 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.732842 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqf6p\" (UniqueName: \"kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.732865 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.732919 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.738406 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.738421 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.739107 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.739472 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.751986 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqf6p\" (UniqueName: \"kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:01 crc kubenswrapper[5036]: I0110 17:00:01.864640 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.447857 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz"] Jan 10 17:00:02 crc kubenswrapper[5036]: W0110 17:00:02.447917 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9f0ccdb_1434_4bd0_90e1_d9314c8d716f.slice/crio-d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178 WatchSource:0}: Error finding container d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178: Status 404 returned error can't find the container with id d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178 Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.480430 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" event={"ID":"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f","Type":"ContainerStarted","Data":"d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178"} Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.480805 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tk92d" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="registry-server" containerID="cri-o://cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908" gracePeriod=2 Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.894983 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.949177 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.962131 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume\") pod \"26cfb13d-e77a-4592-9668-34473b8379af\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.962184 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume\") pod \"26cfb13d-e77a-4592-9668-34473b8379af\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.962228 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv85\" (UniqueName: \"kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85\") pod \"26cfb13d-e77a-4592-9668-34473b8379af\" (UID: \"26cfb13d-e77a-4592-9668-34473b8379af\") " Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.964550 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume" (OuterVolumeSpecName: "config-volume") pod "26cfb13d-e77a-4592-9668-34473b8379af" (UID: "26cfb13d-e77a-4592-9668-34473b8379af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.966776 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85" (OuterVolumeSpecName: "kube-api-access-dwv85") pod "26cfb13d-e77a-4592-9668-34473b8379af" (UID: "26cfb13d-e77a-4592-9668-34473b8379af"). InnerVolumeSpecName "kube-api-access-dwv85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:00:02 crc kubenswrapper[5036]: I0110 17:00:02.978857 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "26cfb13d-e77a-4592-9668-34473b8379af" (UID: "26cfb13d-e77a-4592-9668-34473b8379af"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.067311 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqx67\" (UniqueName: \"kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67\") pod \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.067548 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities\") pod \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.067628 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content\") pod \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\" (UID: \"f52cd7a2-e44b-4e73-a390-3aafa94cb84f\") " Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.068303 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26cfb13d-e77a-4592-9668-34473b8379af-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.068331 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26cfb13d-e77a-4592-9668-34473b8379af-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.068346 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv85\" (UniqueName: \"kubernetes.io/projected/26cfb13d-e77a-4592-9668-34473b8379af-kube-api-access-dwv85\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.069270 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities" (OuterVolumeSpecName: "utilities") pod "f52cd7a2-e44b-4e73-a390-3aafa94cb84f" (UID: "f52cd7a2-e44b-4e73-a390-3aafa94cb84f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.071993 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67" (OuterVolumeSpecName: "kube-api-access-xqx67") pod "f52cd7a2-e44b-4e73-a390-3aafa94cb84f" (UID: "f52cd7a2-e44b-4e73-a390-3aafa94cb84f"). InnerVolumeSpecName "kube-api-access-xqx67". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.123759 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f52cd7a2-e44b-4e73-a390-3aafa94cb84f" (UID: "f52cd7a2-e44b-4e73-a390-3aafa94cb84f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.169967 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.170020 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.170040 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqx67\" (UniqueName: \"kubernetes.io/projected/f52cd7a2-e44b-4e73-a390-3aafa94cb84f-kube-api-access-xqx67\") on node \"crc\" DevicePath \"\"" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.490784 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" event={"ID":"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f","Type":"ContainerStarted","Data":"0eac4af577bae30bf5ce961c532cba0ebae91cd83e673f3d5fbb527b7de45059"} Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.492228 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.492245 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467740-xnkqg" event={"ID":"26cfb13d-e77a-4592-9668-34473b8379af","Type":"ContainerDied","Data":"2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a"} Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.492286 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce2055203fc356dadd80f12ea3d5d2f860aa0054b55cec2ed3a2575d9fe366a" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.494359 5036 generic.go:334] "Generic (PLEG): container finished" podID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerID="cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908" exitCode=0 Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.494398 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerDied","Data":"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908"} Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.494421 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tk92d" event={"ID":"f52cd7a2-e44b-4e73-a390-3aafa94cb84f","Type":"ContainerDied","Data":"7c3571755b2be84685498ca9433cf8b0a33dd290d7fad27f563b276298fb128e"} Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.494424 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tk92d" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.494443 5036 scope.go:117] "RemoveContainer" containerID="cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.510883 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" podStartSLOduration=1.8624929319999999 podStartE2EDuration="2.51086667s" podCreationTimestamp="2026-01-10 17:00:01 +0000 UTC" firstStartedPulling="2026-01-10 17:00:02.450382389 +0000 UTC m=+1924.320617893" lastFinishedPulling="2026-01-10 17:00:03.098756127 +0000 UTC m=+1924.968991631" observedRunningTime="2026-01-10 17:00:03.509591193 +0000 UTC m=+1925.379826687" watchObservedRunningTime="2026-01-10 17:00:03.51086667 +0000 UTC m=+1925.381102164" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.519135 5036 scope.go:117] "RemoveContainer" containerID="3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.537794 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.546279 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tk92d"] Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.555456 5036 scope.go:117] "RemoveContainer" containerID="41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.594543 5036 scope.go:117] "RemoveContainer" containerID="cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908" Jan 10 17:00:03 crc kubenswrapper[5036]: E0110 17:00:03.595208 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908\": container with ID starting with cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908 not found: ID does not exist" containerID="cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.595254 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908"} err="failed to get container status \"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908\": rpc error: code = NotFound desc = could not find container \"cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908\": container with ID starting with cccbdddc64330a0d21a1f8e1830b7e2e8817f0e807cddc1b212e252ca5828908 not found: ID does not exist" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.595289 5036 scope.go:117] "RemoveContainer" containerID="3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3" Jan 10 17:00:03 crc kubenswrapper[5036]: E0110 17:00:03.595509 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3\": container with ID starting with 3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3 not found: ID does not exist" containerID="3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 
17:00:03.595530 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3"} err="failed to get container status \"3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3\": rpc error: code = NotFound desc = could not find container \"3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3\": container with ID starting with 3a7ea2ed0ccfa5a5e26fe11aff456f2eb2ed716bf9ead7c5fd09b3e06a0ef5b3 not found: ID does not exist" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.595544 5036 scope.go:117] "RemoveContainer" containerID="41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b" Jan 10 17:00:03 crc kubenswrapper[5036]: E0110 17:00:03.595778 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b\": container with ID starting with 41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b not found: ID does not exist" containerID="41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.595800 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b"} err="failed to get container status \"41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b\": rpc error: code = NotFound desc = could not find container \"41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b\": container with ID starting with 41c91134f7085bc1b858b1832000824575af4fcd3b2a3207190e3d639d795f4b not found: ID does not exist" Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.974163 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7"] Jan 10 17:00:03 crc kubenswrapper[5036]: I0110 17:00:03.979585 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467695-kv4q7"] Jan 10 17:00:04 crc kubenswrapper[5036]: I0110 17:00:04.519257 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee18389-eb4f-4c7b-98bf-2f9785f21ce4" path="/var/lib/kubelet/pods/8ee18389-eb4f-4c7b-98bf-2f9785f21ce4/volumes" Jan 10 17:00:04 crc kubenswrapper[5036]: I0110 17:00:04.519846 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" path="/var/lib/kubelet/pods/f52cd7a2-e44b-4e73-a390-3aafa94cb84f/volumes" Jan 10 17:00:11 crc kubenswrapper[5036]: I0110 17:00:11.508905 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:00:11 crc kubenswrapper[5036]: E0110 17:00:11.510105 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.471361 5036 scope.go:117] "RemoveContainer" containerID="0c5c3301a4577408f54a0922a0a63167a433bf747ad0ad7bd9b5c05ff7b2b804" Jan 10 17:00:22 crc 
kubenswrapper[5036]: I0110 17:00:22.499224 5036 scope.go:117] "RemoveContainer" containerID="4c8534a5bf9b3e37f93bd5ed1c6a0f741332efeb5607e9f1b6c120b09a26db01" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.508193 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:00:22 crc kubenswrapper[5036]: E0110 17:00:22.508437 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.569083 5036 scope.go:117] "RemoveContainer" containerID="cdb712ec82a0d71a991537a44993a692f3d0fe5eb126b85791d1ad8bdcbd22da" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.598475 5036 scope.go:117] "RemoveContainer" containerID="cbead4e0ab9c072b5403183f12a0b05c59d2e5ce6af1916120449d5a32aff6cd" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.651117 5036 scope.go:117] "RemoveContainer" containerID="154ea3fa105911d8e7aa267185cf51e57c2a00885b21fae621da532db74454ea" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.722514 5036 scope.go:117] "RemoveContainer" containerID="eac7769a5f0268847dd1682479c36236728e90ac1aff3a319efee4508b56dbaa" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.769010 5036 scope.go:117] "RemoveContainer" containerID="8a9ffaa16b7c7a467f0ec1f47e215b5eda413f2801bbcc00c166760c00baa731" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.798798 5036 scope.go:117] "RemoveContainer" containerID="27662017b13517331e40f4940e69f8313a92f2b48d53a21db24568d97f34793a" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.816058 5036 scope.go:117] "RemoveContainer" containerID="f24e224e180ff8eb956a58588ad6c4458e422e67846dbcee935971b1749e185f" Jan 10 17:00:22 crc kubenswrapper[5036]: I0110 17:00:22.893148 5036 scope.go:117] "RemoveContainer" containerID="2f7034153490443e8c5c89aad5807a4745aecde5dd7d98519ae78de7cdc4c5b4" Jan 10 17:00:36 crc kubenswrapper[5036]: I0110 17:00:36.508576 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:00:36 crc kubenswrapper[5036]: E0110 17:00:36.509486 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:00:51 crc kubenswrapper[5036]: I0110 17:00:51.507784 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:00:51 crc kubenswrapper[5036]: E0110 17:00:51.508544 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.207209 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29467741-znhmh"] Jan 10 17:01:00 crc kubenswrapper[5036]: E0110 17:01:00.217744 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="extract-content" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.217786 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="extract-content" Jan 10 17:01:00 crc kubenswrapper[5036]: E0110 17:01:00.217810 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="extract-utilities" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.217822 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="extract-utilities" Jan 10 17:01:00 crc kubenswrapper[5036]: E0110 17:01:00.217938 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26cfb13d-e77a-4592-9668-34473b8379af" containerName="collect-profiles" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.217946 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="26cfb13d-e77a-4592-9668-34473b8379af" containerName="collect-profiles" Jan 10 17:01:00 crc kubenswrapper[5036]: E0110 17:01:00.217964 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="registry-server" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.217974 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="registry-server" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.218231 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="26cfb13d-e77a-4592-9668-34473b8379af" containerName="collect-profiles" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.218256 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52cd7a2-e44b-4e73-a390-3aafa94cb84f" containerName="registry-server" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.219881 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.236650 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29467741-znhmh"] Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.301946 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.302068 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7lg\" (UniqueName: \"kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.302189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.302271 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.404087 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.404164 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.404221 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.404252 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7lg\" (UniqueName: \"kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.410608 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.411352 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.411850 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.419277 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7lg\" (UniqueName: \"kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg\") pod \"keystone-cron-29467741-znhmh\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:00 crc kubenswrapper[5036]: I0110 17:01:00.603846 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:01 crc kubenswrapper[5036]: W0110 17:01:01.065906 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bcb0a70_9f58_48f3_b35d_3adf490692cb.slice/crio-b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd WatchSource:0}: Error finding container b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd: Status 404 returned error can't find the container with id b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd Jan 10 17:01:01 crc kubenswrapper[5036]: I0110 17:01:01.073323 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29467741-znhmh"] Jan 10 17:01:02 crc kubenswrapper[5036]: I0110 17:01:02.075291 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29467741-znhmh" event={"ID":"6bcb0a70-9f58-48f3-b35d-3adf490692cb","Type":"ContainerStarted","Data":"12111fd7e8dcd83df7d3df4291c30d46a2df6f7a1d6fd1912f8eb5317a418e60"} Jan 10 17:01:02 crc kubenswrapper[5036]: I0110 17:01:02.076114 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29467741-znhmh" event={"ID":"6bcb0a70-9f58-48f3-b35d-3adf490692cb","Type":"ContainerStarted","Data":"b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd"} Jan 10 17:01:02 crc kubenswrapper[5036]: I0110 17:01:02.098210 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29467741-znhmh" podStartSLOduration=2.0981911540000002 podStartE2EDuration="2.098191154s" podCreationTimestamp="2026-01-10 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:01:02.09804939 +0000 UTC m=+1983.968284894" watchObservedRunningTime="2026-01-10 17:01:02.098191154 +0000 UTC m=+1983.968426658" Jan 10 17:01:03 crc kubenswrapper[5036]: I0110 17:01:03.508427 
5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:01:04 crc kubenswrapper[5036]: I0110 17:01:04.096074 5036 generic.go:334] "Generic (PLEG): container finished" podID="6bcb0a70-9f58-48f3-b35d-3adf490692cb" containerID="12111fd7e8dcd83df7d3df4291c30d46a2df6f7a1d6fd1912f8eb5317a418e60" exitCode=0 Jan 10 17:01:04 crc kubenswrapper[5036]: I0110 17:01:04.096173 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29467741-znhmh" event={"ID":"6bcb0a70-9f58-48f3-b35d-3adf490692cb","Type":"ContainerDied","Data":"12111fd7e8dcd83df7d3df4291c30d46a2df6f7a1d6fd1912f8eb5317a418e60"} Jan 10 17:01:04 crc kubenswrapper[5036]: I0110 17:01:04.100870 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5"} Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.444597 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.597415 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle\") pod \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.597598 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data\") pod \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.597672 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp7lg\" (UniqueName: \"kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg\") pod \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.597716 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys\") pod \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\" (UID: \"6bcb0a70-9f58-48f3-b35d-3adf490692cb\") " Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.604891 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg" (OuterVolumeSpecName: "kube-api-access-bp7lg") pod "6bcb0a70-9f58-48f3-b35d-3adf490692cb" (UID: "6bcb0a70-9f58-48f3-b35d-3adf490692cb"). InnerVolumeSpecName "kube-api-access-bp7lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.612597 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6bcb0a70-9f58-48f3-b35d-3adf490692cb" (UID: "6bcb0a70-9f58-48f3-b35d-3adf490692cb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.646871 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bcb0a70-9f58-48f3-b35d-3adf490692cb" (UID: "6bcb0a70-9f58-48f3-b35d-3adf490692cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.661884 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data" (OuterVolumeSpecName: "config-data") pod "6bcb0a70-9f58-48f3-b35d-3adf490692cb" (UID: "6bcb0a70-9f58-48f3-b35d-3adf490692cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.702468 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.702513 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp7lg\" (UniqueName: \"kubernetes.io/projected/6bcb0a70-9f58-48f3-b35d-3adf490692cb-kube-api-access-bp7lg\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.702533 5036 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:05 crc kubenswrapper[5036]: I0110 17:01:05.702549 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bcb0a70-9f58-48f3-b35d-3adf490692cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:06 crc kubenswrapper[5036]: I0110 17:01:06.123262 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29467741-znhmh" event={"ID":"6bcb0a70-9f58-48f3-b35d-3adf490692cb","Type":"ContainerDied","Data":"b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd"} Jan 10 17:01:06 crc kubenswrapper[5036]: I0110 17:01:06.123619 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3190bc4de211fd3d6f1c926bd1c01fccfe1f58b65bd8fef66c1b66df14e39bd" Jan 10 17:01:06 crc kubenswrapper[5036]: I0110 17:01:06.123360 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29467741-znhmh" Jan 10 17:01:23 crc kubenswrapper[5036]: I0110 17:01:23.044380 5036 scope.go:117] "RemoveContainer" containerID="67f3b7fd24ff2a03470643cb4651ff7f9c5e5867383284789883ee79833bcec7" Jan 10 17:01:23 crc kubenswrapper[5036]: I0110 17:01:23.081621 5036 scope.go:117] "RemoveContainer" containerID="955496c32694a10b2d348bd4a8f1c73c530e4f25b291066855c51fdccd8fc08f" Jan 10 17:01:46 crc kubenswrapper[5036]: I0110 17:01:46.469093 5036 generic.go:334] "Generic (PLEG): container finished" podID="d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" containerID="0eac4af577bae30bf5ce961c532cba0ebae91cd83e673f3d5fbb527b7de45059" exitCode=0 Jan 10 17:01:46 crc kubenswrapper[5036]: I0110 17:01:46.469186 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" event={"ID":"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f","Type":"ContainerDied","Data":"0eac4af577bae30bf5ce961c532cba0ebae91cd83e673f3d5fbb527b7de45059"} Jan 10 17:01:47 crc kubenswrapper[5036]: I0110 17:01:47.893002 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.004943 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqf6p\" (UniqueName: \"kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p\") pod \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.005217 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle\") pod \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.005265 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph\") pod \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.005295 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory\") pod \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.005321 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam\") pod \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\" (UID: \"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f\") " Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.011844 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" (UID: "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.011908 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph" (OuterVolumeSpecName: "ceph") pod "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" (UID: "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.012213 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p" (OuterVolumeSpecName: "kube-api-access-dqf6p") pod "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" (UID: "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f"). InnerVolumeSpecName "kube-api-access-dqf6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.031090 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory" (OuterVolumeSpecName: "inventory") pod "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" (UID: "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.033428 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" (UID: "d9f0ccdb-1434-4bd0-90e1-d9314c8d716f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.107295 5036 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.107375 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.107387 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.107397 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.107408 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqf6p\" (UniqueName: \"kubernetes.io/projected/d9f0ccdb-1434-4bd0-90e1-d9314c8d716f-kube-api-access-dqf6p\") on node \"crc\" DevicePath \"\"" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.496662 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" event={"ID":"d9f0ccdb-1434-4bd0-90e1-d9314c8d716f","Type":"ContainerDied","Data":"d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178"} Jan 10 17:01:48 crc 
kubenswrapper[5036]: I0110 17:01:48.497041 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d08445a8c9308c3bc1f6ff1b2fd683f030e5d63a933c19a2e3ca1c90538ad178" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.497256 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.596421 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4"] Jan 10 17:01:48 crc kubenswrapper[5036]: E0110 17:01:48.596814 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.596832 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 17:01:48 crc kubenswrapper[5036]: E0110 17:01:48.596855 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bcb0a70-9f58-48f3-b35d-3adf490692cb" containerName="keystone-cron" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.596861 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bcb0a70-9f58-48f3-b35d-3adf490692cb" containerName="keystone-cron" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.597020 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bcb0a70-9f58-48f3-b35d-3adf490692cb" containerName="keystone-cron" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.597043 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f0ccdb-1434-4bd0-90e1-d9314c8d716f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.597613 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.602513 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.602699 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.602761 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.602911 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.602921 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.603362 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4"] Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.617430 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.617475 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndbx\" (UniqueName: \"kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.617503 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.617622 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.719249 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc 
kubenswrapper[5036]: I0110 17:01:48.719356 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndbx\" (UniqueName: \"kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.719394 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.719520 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.725450 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.726451 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.727188 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.739778 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndbx\" (UniqueName: \"kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:48 crc kubenswrapper[5036]: I0110 17:01:48.912998 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:01:49 crc kubenswrapper[5036]: I0110 17:01:49.415560 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4"] Jan 10 17:01:49 crc kubenswrapper[5036]: I0110 17:01:49.504961 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" event={"ID":"feaba290-606b-4396-af62-f32fd6e33a53","Type":"ContainerStarted","Data":"bfd9c521c73406746ad00717c12bfeceb069b32941b1a9d78fb78d3d121e5595"} Jan 10 17:01:50 crc kubenswrapper[5036]: I0110 17:01:50.523527 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" event={"ID":"feaba290-606b-4396-af62-f32fd6e33a53","Type":"ContainerStarted","Data":"d0171dade94a94797ea18fb8b57b390d5945b0fd58aed429c0f1adde31f90f23"} Jan 10 17:01:50 crc kubenswrapper[5036]: I0110 17:01:50.540469 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" podStartSLOduration=1.971652274 podStartE2EDuration="2.540450521s" podCreationTimestamp="2026-01-10 17:01:48 +0000 UTC" firstStartedPulling="2026-01-10 17:01:49.424557368 +0000 UTC m=+2031.294792872" lastFinishedPulling="2026-01-10 17:01:49.993355605 +0000 UTC m=+2031.863591119" observedRunningTime="2026-01-10 17:01:50.532811522 +0000 UTC m=+2032.403047026" watchObservedRunningTime="2026-01-10 17:01:50.540450521 +0000 UTC m=+2032.410686005" Jan 10 17:02:15 crc kubenswrapper[5036]: I0110 17:02:15.715325 5036 generic.go:334] "Generic (PLEG): container finished" podID="feaba290-606b-4396-af62-f32fd6e33a53" containerID="d0171dade94a94797ea18fb8b57b390d5945b0fd58aed429c0f1adde31f90f23" exitCode=0 Jan 10 17:02:15 crc kubenswrapper[5036]: I0110 17:02:15.715411 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" event={"ID":"feaba290-606b-4396-af62-f32fd6e33a53","Type":"ContainerDied","Data":"d0171dade94a94797ea18fb8b57b390d5945b0fd58aed429c0f1adde31f90f23"} Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.282762 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.283754 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndbx\" (UniqueName: \"kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx\") pod \"feaba290-606b-4396-af62-f32fd6e33a53\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.283801 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam\") pod \"feaba290-606b-4396-af62-f32fd6e33a53\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.283855 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph\") pod \"feaba290-606b-4396-af62-f32fd6e33a53\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.289496 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph" (OuterVolumeSpecName: "ceph") pod "feaba290-606b-4396-af62-f32fd6e33a53" (UID: "feaba290-606b-4396-af62-f32fd6e33a53"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.291004 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx" (OuterVolumeSpecName: "kube-api-access-6ndbx") pod "feaba290-606b-4396-af62-f32fd6e33a53" (UID: "feaba290-606b-4396-af62-f32fd6e33a53"). InnerVolumeSpecName "kube-api-access-6ndbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.331910 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "feaba290-606b-4396-af62-f32fd6e33a53" (UID: "feaba290-606b-4396-af62-f32fd6e33a53"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.385794 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndbx\" (UniqueName: \"kubernetes.io/projected/feaba290-606b-4396-af62-f32fd6e33a53-kube-api-access-6ndbx\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.385846 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.385859 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.487207 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory\") pod \"feaba290-606b-4396-af62-f32fd6e33a53\" (UID: \"feaba290-606b-4396-af62-f32fd6e33a53\") " Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.509718 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory" (OuterVolumeSpecName: "inventory") pod "feaba290-606b-4396-af62-f32fd6e33a53" (UID: "feaba290-606b-4396-af62-f32fd6e33a53"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.588667 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/feaba290-606b-4396-af62-f32fd6e33a53-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.736062 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" event={"ID":"feaba290-606b-4396-af62-f32fd6e33a53","Type":"ContainerDied","Data":"bfd9c521c73406746ad00717c12bfeceb069b32941b1a9d78fb78d3d121e5595"} Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.736106 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfd9c521c73406746ad00717c12bfeceb069b32941b1a9d78fb78d3d121e5595" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.736151 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.857871 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg"] Jan 10 17:02:17 crc kubenswrapper[5036]: E0110 17:02:17.858369 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feaba290-606b-4396-af62-f32fd6e33a53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.858396 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="feaba290-606b-4396-af62-f32fd6e33a53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.858713 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="feaba290-606b-4396-af62-f32fd6e33a53" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.859630 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.864478 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.864772 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.864870 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.865092 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.868101 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.870180 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg"] Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.903700 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.904021 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmv4n\" (UniqueName: \"kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.904104 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:17 crc kubenswrapper[5036]: I0110 17:02:17.904152 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.005366 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmv4n\" (UniqueName: \"kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.005766 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.005793 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.005830 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.010514 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.010671 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.011200 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.020086 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmv4n\" (UniqueName: \"kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.185789 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.695522 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg"] Jan 10 17:02:18 crc kubenswrapper[5036]: I0110 17:02:18.747236 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" event={"ID":"be9c4cc3-5744-42de-809a-fcd16a407199","Type":"ContainerStarted","Data":"f661d8beeb1b60c4029ccb26ad82779f7d3294e06d69e9523c91f147f3540e00"} Jan 10 17:02:19 crc kubenswrapper[5036]: I0110 17:02:19.756211 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" event={"ID":"be9c4cc3-5744-42de-809a-fcd16a407199","Type":"ContainerStarted","Data":"1e25811fd0c665844340adfdd649d280bf690c6fb72a534dd47391529185f30d"} Jan 10 17:02:19 crc kubenswrapper[5036]: I0110 17:02:19.786570 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" podStartSLOduration=2.254732903 podStartE2EDuration="2.786548302s" podCreationTimestamp="2026-01-10 17:02:17 +0000 UTC" firstStartedPulling="2026-01-10 17:02:18.702608004 +0000 UTC m=+2060.572843488" lastFinishedPulling="2026-01-10 17:02:19.234423373 +0000 UTC m=+2061.104658887" observedRunningTime="2026-01-10 17:02:19.775893027 +0000 UTC m=+2061.646128531" watchObservedRunningTime="2026-01-10 17:02:19.786548302 +0000 UTC m=+2061.656783816" Jan 10 17:02:23 crc kubenswrapper[5036]: I0110 17:02:23.224705 5036 scope.go:117] "RemoveContainer" containerID="4fa98bca3cb058e28dc9fe21a78adab57b00c56666815e3d0fd83a5125692672" Jan 10 17:02:23 crc kubenswrapper[5036]: I0110 17:02:23.276807 5036 scope.go:117] "RemoveContainer" containerID="e78f8734699e9ba36030a91cb0b112add8a3c8ac6246e0f6dad2df7cea8ce553" Jan 10 17:02:24 crc kubenswrapper[5036]: I0110 17:02:24.795482 5036 generic.go:334] "Generic (PLEG): container finished" podID="be9c4cc3-5744-42de-809a-fcd16a407199" containerID="1e25811fd0c665844340adfdd649d280bf690c6fb72a534dd47391529185f30d" exitCode=0 Jan 10 17:02:24 crc kubenswrapper[5036]: I0110 17:02:24.795526 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" event={"ID":"be9c4cc3-5744-42de-809a-fcd16a407199","Type":"ContainerDied","Data":"1e25811fd0c665844340adfdd649d280bf690c6fb72a534dd47391529185f30d"} Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.317589 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.383810 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory\") pod \"be9c4cc3-5744-42de-809a-fcd16a407199\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.384176 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam\") pod \"be9c4cc3-5744-42de-809a-fcd16a407199\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.384226 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmv4n\" (UniqueName: \"kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n\") pod \"be9c4cc3-5744-42de-809a-fcd16a407199\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.384269 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph\") pod \"be9c4cc3-5744-42de-809a-fcd16a407199\" (UID: \"be9c4cc3-5744-42de-809a-fcd16a407199\") " Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.391034 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n" (OuterVolumeSpecName: "kube-api-access-dmv4n") pod "be9c4cc3-5744-42de-809a-fcd16a407199" (UID: "be9c4cc3-5744-42de-809a-fcd16a407199"). InnerVolumeSpecName "kube-api-access-dmv4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.391858 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph" (OuterVolumeSpecName: "ceph") pod "be9c4cc3-5744-42de-809a-fcd16a407199" (UID: "be9c4cc3-5744-42de-809a-fcd16a407199"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.407824 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory" (OuterVolumeSpecName: "inventory") pod "be9c4cc3-5744-42de-809a-fcd16a407199" (UID: "be9c4cc3-5744-42de-809a-fcd16a407199"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.415229 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be9c4cc3-5744-42de-809a-fcd16a407199" (UID: "be9c4cc3-5744-42de-809a-fcd16a407199"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.486417 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.486466 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmv4n\" (UniqueName: \"kubernetes.io/projected/be9c4cc3-5744-42de-809a-fcd16a407199-kube-api-access-dmv4n\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.486480 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.486493 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be9c4cc3-5744-42de-809a-fcd16a407199-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.873593 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" event={"ID":"be9c4cc3-5744-42de-809a-fcd16a407199","Type":"ContainerDied","Data":"f661d8beeb1b60c4029ccb26ad82779f7d3294e06d69e9523c91f147f3540e00"} Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.873655 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f661d8beeb1b60c4029ccb26ad82779f7d3294e06d69e9523c91f147f3540e00" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.873666 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.926326 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf"] Jan 10 17:02:26 crc kubenswrapper[5036]: E0110 17:02:26.926786 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9c4cc3-5744-42de-809a-fcd16a407199" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.926827 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9c4cc3-5744-42de-809a-fcd16a407199" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.927101 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9c4cc3-5744-42de-809a-fcd16a407199" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.927967 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.938890 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.948694 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf"] Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.948869 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.949540 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.949542 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.949547 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.996537 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.996588 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.996696 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzrn\" (UniqueName: \"kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:26 crc kubenswrapper[5036]: I0110 17:02:26.996846 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.098888 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzrn\" (UniqueName: \"kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.098955 5036 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.099117 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.099156 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.103893 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.104507 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.106377 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.116385 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzrn\" (UniqueName: \"kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zzhbf\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.263526 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.824712 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf"] Jan 10 17:02:27 crc kubenswrapper[5036]: I0110 17:02:27.882905 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" event={"ID":"7be91e0f-1820-445f-b106-0558e046ac4a","Type":"ContainerStarted","Data":"6c3d55ed79ce418e250b8b133da4d7c55cf83175a157e5caa80f10777e22e965"} Jan 10 17:02:28 crc kubenswrapper[5036]: I0110 17:02:28.890386 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" event={"ID":"7be91e0f-1820-445f-b106-0558e046ac4a","Type":"ContainerStarted","Data":"3e6f3630f3408ab500fa94660d798dbd2d03e9bfcafea4c72c001db78cf65aa5"} Jan 10 17:02:28 crc kubenswrapper[5036]: I0110 17:02:28.912157 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" podStartSLOduration=2.4573992430000002 podStartE2EDuration="2.912141663s" podCreationTimestamp="2026-01-10 17:02:26 +0000 UTC" firstStartedPulling="2026-01-10 17:02:27.827154396 +0000 UTC m=+2069.697389890" lastFinishedPulling="2026-01-10 17:02:28.281896816 +0000 UTC m=+2070.152132310" observedRunningTime="2026-01-10 17:02:28.910973129 +0000 UTC m=+2070.781208653" watchObservedRunningTime="2026-01-10 17:02:28.912141663 +0000 UTC m=+2070.782377157" Jan 10 17:03:07 crc kubenswrapper[5036]: I0110 17:03:07.247117 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" event={"ID":"7be91e0f-1820-445f-b106-0558e046ac4a","Type":"ContainerDied","Data":"3e6f3630f3408ab500fa94660d798dbd2d03e9bfcafea4c72c001db78cf65aa5"} Jan 10 17:03:07 crc kubenswrapper[5036]: I0110 17:03:07.247161 5036 generic.go:334] "Generic (PLEG): container finished" podID="7be91e0f-1820-445f-b106-0558e046ac4a" containerID="3e6f3630f3408ab500fa94660d798dbd2d03e9bfcafea4c72c001db78cf65aa5" exitCode=0 Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.683157 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.792235 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory\") pod \"7be91e0f-1820-445f-b106-0558e046ac4a\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.792538 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph\") pod \"7be91e0f-1820-445f-b106-0558e046ac4a\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.792609 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam\") pod \"7be91e0f-1820-445f-b106-0558e046ac4a\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.792796 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzrn\" (UniqueName: \"kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn\") pod \"7be91e0f-1820-445f-b106-0558e046ac4a\" (UID: \"7be91e0f-1820-445f-b106-0558e046ac4a\") " Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.798880 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph" (OuterVolumeSpecName: "ceph") pod "7be91e0f-1820-445f-b106-0558e046ac4a" (UID: "7be91e0f-1820-445f-b106-0558e046ac4a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.798943 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn" (OuterVolumeSpecName: "kube-api-access-jwzrn") pod "7be91e0f-1820-445f-b106-0558e046ac4a" (UID: "7be91e0f-1820-445f-b106-0558e046ac4a"). InnerVolumeSpecName "kube-api-access-jwzrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.817235 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory" (OuterVolumeSpecName: "inventory") pod "7be91e0f-1820-445f-b106-0558e046ac4a" (UID: "7be91e0f-1820-445f-b106-0558e046ac4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.823112 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7be91e0f-1820-445f-b106-0558e046ac4a" (UID: "7be91e0f-1820-445f-b106-0558e046ac4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.895607 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.895707 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwzrn\" (UniqueName: \"kubernetes.io/projected/7be91e0f-1820-445f-b106-0558e046ac4a-kube-api-access-jwzrn\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.895741 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:08 crc kubenswrapper[5036]: I0110 17:03:08.895767 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7be91e0f-1820-445f-b106-0558e046ac4a-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.268319 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" event={"ID":"7be91e0f-1820-445f-b106-0558e046ac4a","Type":"ContainerDied","Data":"6c3d55ed79ce418e250b8b133da4d7c55cf83175a157e5caa80f10777e22e965"} Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.268362 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zzhbf" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.268388 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c3d55ed79ce418e250b8b133da4d7c55cf83175a157e5caa80f10777e22e965" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.368241 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77"] Jan 10 17:03:09 crc kubenswrapper[5036]: E0110 17:03:09.368946 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be91e0f-1820-445f-b106-0558e046ac4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.369045 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be91e0f-1820-445f-b106-0558e046ac4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.369345 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be91e0f-1820-445f-b106-0558e046ac4a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.370248 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.374275 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.376897 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.377341 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.377723 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.378127 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.392736 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77"] Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.505253 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.505422 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlt8\" (UniqueName: \"kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.505551 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.506347 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.612956 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.614941 5036 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.615143 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.615407 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlt8\" (UniqueName: \"kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.620826 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.620956 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.624643 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.637129 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlt8\" (UniqueName: \"kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:09 crc kubenswrapper[5036]: I0110 17:03:09.694919 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:10 crc kubenswrapper[5036]: I0110 17:03:10.093377 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77"] Jan 10 17:03:10 crc kubenswrapper[5036]: I0110 17:03:10.275162 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" event={"ID":"993c9fcb-a10b-4d08-ae74-2bc52e9d8131","Type":"ContainerStarted","Data":"bd318895d147ec053ec4f9d0f11c04ecd55394559e3257a9e6d5f3d95e9bf443"} Jan 10 17:03:11 crc kubenswrapper[5036]: I0110 17:03:11.292818 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" event={"ID":"993c9fcb-a10b-4d08-ae74-2bc52e9d8131","Type":"ContainerStarted","Data":"ea350d4f09abb7f319e7c2b4f6661d9ae529a4dacf60410a171549d188e21ea2"} Jan 10 17:03:11 crc kubenswrapper[5036]: I0110 17:03:11.318325 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" podStartSLOduration=1.794812715 podStartE2EDuration="2.318309305s" podCreationTimestamp="2026-01-10 17:03:09 +0000 UTC" firstStartedPulling="2026-01-10 17:03:10.09195046 +0000 UTC m=+2111.962185964" lastFinishedPulling="2026-01-10 17:03:10.61544706 +0000 UTC m=+2112.485682554" observedRunningTime="2026-01-10 17:03:11.316267616 +0000 UTC m=+2113.186503110" watchObservedRunningTime="2026-01-10 17:03:11.318309305 +0000 UTC m=+2113.188544799" Jan 10 17:03:11 crc kubenswrapper[5036]: I0110 17:03:11.884239 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:11 crc kubenswrapper[5036]: I0110 17:03:11.886668 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:11 crc kubenswrapper[5036]: I0110 17:03:11.894818 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.069500 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.069879 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.069980 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwglg\" (UniqueName: \"kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.171474 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.171640 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.171694 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwglg\" (UniqueName: \"kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.172326 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.172361 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.194812 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xwglg\" (UniqueName: \"kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg\") pod \"redhat-marketplace-mxpcj\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.209837 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:12 crc kubenswrapper[5036]: I0110 17:03:12.686930 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:12 crc kubenswrapper[5036]: W0110 17:03:12.692299 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c651b9e_b6b9_434f_a931_bc1748f5acdb.slice/crio-4d610701a6295aded075bc903b698a76e58042e81456e632bb832a350df1e71f WatchSource:0}: Error finding container 4d610701a6295aded075bc903b698a76e58042e81456e632bb832a350df1e71f: Status 404 returned error can't find the container with id 4d610701a6295aded075bc903b698a76e58042e81456e632bb832a350df1e71f Jan 10 17:03:13 crc kubenswrapper[5036]: I0110 17:03:13.312181 5036 generic.go:334] "Generic (PLEG): container finished" podID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerID="a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c" exitCode=0 Jan 10 17:03:13 crc kubenswrapper[5036]: I0110 17:03:13.312290 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerDied","Data":"a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c"} Jan 10 17:03:13 crc kubenswrapper[5036]: I0110 17:03:13.312649 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerStarted","Data":"4d610701a6295aded075bc903b698a76e58042e81456e632bb832a350df1e71f"} Jan 10 17:03:15 crc kubenswrapper[5036]: I0110 17:03:15.332900 5036 generic.go:334] "Generic (PLEG): container finished" podID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerID="b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2" exitCode=0 Jan 10 17:03:15 crc kubenswrapper[5036]: I0110 17:03:15.333101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerDied","Data":"b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2"} Jan 10 17:03:15 crc kubenswrapper[5036]: I0110 17:03:15.338927 5036 generic.go:334] "Generic (PLEG): container finished" podID="993c9fcb-a10b-4d08-ae74-2bc52e9d8131" containerID="ea350d4f09abb7f319e7c2b4f6661d9ae529a4dacf60410a171549d188e21ea2" exitCode=0 Jan 10 17:03:15 crc kubenswrapper[5036]: I0110 17:03:15.338957 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" event={"ID":"993c9fcb-a10b-4d08-ae74-2bc52e9d8131","Type":"ContainerDied","Data":"ea350d4f09abb7f319e7c2b4f6661d9ae529a4dacf60410a171549d188e21ea2"} Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.349080 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" 
event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerStarted","Data":"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c"} Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.377217 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mxpcj" podStartSLOduration=2.854194448 podStartE2EDuration="5.377195151s" podCreationTimestamp="2026-01-10 17:03:11 +0000 UTC" firstStartedPulling="2026-01-10 17:03:13.314571637 +0000 UTC m=+2115.184807171" lastFinishedPulling="2026-01-10 17:03:15.83757234 +0000 UTC m=+2117.707807874" observedRunningTime="2026-01-10 17:03:16.372503267 +0000 UTC m=+2118.242738761" watchObservedRunningTime="2026-01-10 17:03:16.377195151 +0000 UTC m=+2118.247430655" Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.820735 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.959747 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlt8\" (UniqueName: \"kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8\") pod \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.959829 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam\") pod \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.959868 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory\") pod \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.959989 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph\") pod \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\" (UID: \"993c9fcb-a10b-4d08-ae74-2bc52e9d8131\") " Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.965355 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph" (OuterVolumeSpecName: "ceph") pod "993c9fcb-a10b-4d08-ae74-2bc52e9d8131" (UID: "993c9fcb-a10b-4d08-ae74-2bc52e9d8131"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.965722 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8" (OuterVolumeSpecName: "kube-api-access-mmlt8") pod "993c9fcb-a10b-4d08-ae74-2bc52e9d8131" (UID: "993c9fcb-a10b-4d08-ae74-2bc52e9d8131"). InnerVolumeSpecName "kube-api-access-mmlt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.984632 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "993c9fcb-a10b-4d08-ae74-2bc52e9d8131" (UID: "993c9fcb-a10b-4d08-ae74-2bc52e9d8131"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:16 crc kubenswrapper[5036]: I0110 17:03:16.986991 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory" (OuterVolumeSpecName: "inventory") pod "993c9fcb-a10b-4d08-ae74-2bc52e9d8131" (UID: "993c9fcb-a10b-4d08-ae74-2bc52e9d8131"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.062073 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.062110 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.062119 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.062128 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmlt8\" (UniqueName: \"kubernetes.io/projected/993c9fcb-a10b-4d08-ae74-2bc52e9d8131-kube-api-access-mmlt8\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.359173 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.359228 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77" event={"ID":"993c9fcb-a10b-4d08-ae74-2bc52e9d8131","Type":"ContainerDied","Data":"bd318895d147ec053ec4f9d0f11c04ecd55394559e3257a9e6d5f3d95e9bf443"} Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.359258 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd318895d147ec053ec4f9d0f11c04ecd55394559e3257a9e6d5f3d95e9bf443" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.472530 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng"] Jan 10 17:03:17 crc kubenswrapper[5036]: E0110 17:03:17.472983 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993c9fcb-a10b-4d08-ae74-2bc52e9d8131" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.473007 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="993c9fcb-a10b-4d08-ae74-2bc52e9d8131" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.473276 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="993c9fcb-a10b-4d08-ae74-2bc52e9d8131" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.474094 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.476463 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.476566 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.476869 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.476959 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.477768 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.482726 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng"] Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.669952 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.670794 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.671396 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjdm\" (UniqueName: \"kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.671634 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.772932 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjdm\" (UniqueName: \"kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.773292 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.773467 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.773620 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.778521 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.778540 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.778794 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:17 crc kubenswrapper[5036]: I0110 17:03:17.794177 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjdm\" (UniqueName: \"kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-b4gng\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:18 crc kubenswrapper[5036]: I0110 17:03:18.091786 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:03:18 crc kubenswrapper[5036]: I0110 17:03:18.665700 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng"] Jan 10 17:03:18 crc kubenswrapper[5036]: W0110 17:03:18.666501 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf597c03_b76a_445a_84d3_034d70ca102e.slice/crio-7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea WatchSource:0}: Error finding container 7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea: Status 404 returned error can't find the container with id 7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea Jan 10 17:03:19 crc kubenswrapper[5036]: I0110 17:03:19.385155 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" event={"ID":"bf597c03-b76a-445a-84d3-034d70ca102e","Type":"ContainerStarted","Data":"7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea"} Jan 10 17:03:20 crc kubenswrapper[5036]: I0110 17:03:20.396627 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" event={"ID":"bf597c03-b76a-445a-84d3-034d70ca102e","Type":"ContainerStarted","Data":"1af236997f0e330b3ee4af2be22c88567138c483d380c6d36aef6841829609c0"} Jan 10 17:03:20 crc kubenswrapper[5036]: I0110 17:03:20.428161 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" podStartSLOduration=2.893235849 podStartE2EDuration="3.428142466s" podCreationTimestamp="2026-01-10 17:03:17 +0000 UTC" firstStartedPulling="2026-01-10 17:03:18.669612362 +0000 UTC m=+2120.539847856" lastFinishedPulling="2026-01-10 17:03:19.204518939 +0000 UTC m=+2121.074754473" observedRunningTime="2026-01-10 17:03:20.421755633 +0000 UTC m=+2122.291991157" watchObservedRunningTime="2026-01-10 17:03:20.428142466 +0000 UTC m=+2122.298377970" Jan 10 17:03:22 crc kubenswrapper[5036]: I0110 17:03:22.210640 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:22 crc 
kubenswrapper[5036]: I0110 17:03:22.211164 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:22 crc kubenswrapper[5036]: I0110 17:03:22.281086 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:22 crc kubenswrapper[5036]: I0110 17:03:22.453186 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:22 crc kubenswrapper[5036]: I0110 17:03:22.523307 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.421829 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mxpcj" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="registry-server" containerID="cri-o://c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c" gracePeriod=2 Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.903179 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.924744 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content\") pod \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.924832 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwglg\" (UniqueName: \"kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg\") pod \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.924868 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities\") pod \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\" (UID: \"9c651b9e-b6b9-434f-a931-bc1748f5acdb\") " Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.926232 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities" (OuterVolumeSpecName: "utilities") pod "9c651b9e-b6b9-434f-a931-bc1748f5acdb" (UID: "9c651b9e-b6b9-434f-a931-bc1748f5acdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.931574 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg" (OuterVolumeSpecName: "kube-api-access-xwglg") pod "9c651b9e-b6b9-434f-a931-bc1748f5acdb" (UID: "9c651b9e-b6b9-434f-a931-bc1748f5acdb"). InnerVolumeSpecName "kube-api-access-xwglg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:03:24 crc kubenswrapper[5036]: I0110 17:03:24.951734 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c651b9e-b6b9-434f-a931-bc1748f5acdb" (UID: "9c651b9e-b6b9-434f-a931-bc1748f5acdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.027230 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.027272 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwglg\" (UniqueName: \"kubernetes.io/projected/9c651b9e-b6b9-434f-a931-bc1748f5acdb-kube-api-access-xwglg\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.027284 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c651b9e-b6b9-434f-a931-bc1748f5acdb-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.432500 5036 generic.go:334] "Generic (PLEG): container finished" podID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerID="c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c" exitCode=0 Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.432546 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerDied","Data":"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c"} Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.432576 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mxpcj" event={"ID":"9c651b9e-b6b9-434f-a931-bc1748f5acdb","Type":"ContainerDied","Data":"4d610701a6295aded075bc903b698a76e58042e81456e632bb832a350df1e71f"} Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.432598 5036 scope.go:117] "RemoveContainer" containerID="c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.432622 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mxpcj" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.454302 5036 scope.go:117] "RemoveContainer" containerID="b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.473820 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.475378 5036 scope.go:117] "RemoveContainer" containerID="a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.480438 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mxpcj"] Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.513641 5036 scope.go:117] "RemoveContainer" containerID="c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c" Jan 10 17:03:25 crc kubenswrapper[5036]: E0110 17:03:25.514089 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c\": container with ID starting with c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c not found: ID does not exist" containerID="c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.514124 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c"} err="failed to get container status \"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c\": rpc error: code = NotFound desc = could not find container \"c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c\": container with ID starting with c8f3f1e6ce754b46afe7c187acf3bcc59854695edacdbe4be9eec596bde84c2c not found: ID does not exist" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.514155 5036 scope.go:117] "RemoveContainer" containerID="b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2" Jan 10 17:03:25 crc kubenswrapper[5036]: E0110 17:03:25.514411 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2\": container with ID starting with b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2 not found: ID does not exist" containerID="b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.514433 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2"} err="failed to get container status \"b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2\": rpc error: code = NotFound desc = could not find container \"b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2\": container with ID starting with b074366aa8eb56de1dccf62b82bd6ff42c4c334ae176b9527d0f21adfffb03a2 not found: ID does not exist" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.514453 5036 scope.go:117] "RemoveContainer" containerID="a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c" Jan 10 17:03:25 crc kubenswrapper[5036]: E0110 17:03:25.514981 5036 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c\": container with ID starting with a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c not found: ID does not exist" containerID="a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.515010 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c"} err="failed to get container status \"a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c\": rpc error: code = NotFound desc = could not find container \"a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c\": container with ID starting with a38de30b9640a9a02a2adfead1001922714a9837a4bdb1e6bf3f645363803e1c not found: ID does not exist" Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.904920 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:03:25 crc kubenswrapper[5036]: I0110 17:03:25.904995 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:03:26 crc kubenswrapper[5036]: I0110 17:03:26.522822 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" path="/var/lib/kubelet/pods/9c651b9e-b6b9-434f-a931-bc1748f5acdb/volumes" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.797693 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:28 crc kubenswrapper[5036]: E0110 17:03:28.798487 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="registry-server" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.798503 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="registry-server" Jan 10 17:03:28 crc kubenswrapper[5036]: E0110 17:03:28.798513 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="extract-utilities" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.798521 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="extract-utilities" Jan 10 17:03:28 crc kubenswrapper[5036]: E0110 17:03:28.798547 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="extract-content" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.798555 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="extract-content" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.798785 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c651b9e-b6b9-434f-a931-bc1748f5acdb" containerName="registry-server" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 
17:03:28.800405 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.815925 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.931121 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzkrp\" (UniqueName: \"kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.931166 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:28 crc kubenswrapper[5036]: I0110 17:03:28.931637 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.033439 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.033550 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzkrp\" (UniqueName: \"kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.033570 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.034041 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.034069 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.060630 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzkrp\" (UniqueName: \"kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp\") pod \"redhat-operators-pv4sb\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.135741 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:29 crc kubenswrapper[5036]: I0110 17:03:29.823672 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:30 crc kubenswrapper[5036]: I0110 17:03:30.478910 5036 generic.go:334] "Generic (PLEG): container finished" podID="1835af21-debc-47a7-a7a8-05795ea58a44" containerID="b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7" exitCode=0 Jan 10 17:03:30 crc kubenswrapper[5036]: I0110 17:03:30.479182 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerDied","Data":"b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7"} Jan 10 17:03:30 crc kubenswrapper[5036]: I0110 17:03:30.479453 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerStarted","Data":"81fce7adeec3d8d3a466e071b23c8a3a1e552b5cfcb68e5b052a3d92445dab78"} Jan 10 17:03:32 crc kubenswrapper[5036]: I0110 17:03:32.494424 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerStarted","Data":"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0"} Jan 10 17:03:34 crc kubenswrapper[5036]: I0110 17:03:34.512452 5036 generic.go:334] "Generic (PLEG): container finished" podID="1835af21-debc-47a7-a7a8-05795ea58a44" containerID="027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0" exitCode=0 Jan 10 17:03:34 crc kubenswrapper[5036]: I0110 17:03:34.519188 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerDied","Data":"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0"} Jan 10 17:03:36 crc kubenswrapper[5036]: I0110 17:03:36.546988 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerStarted","Data":"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8"} Jan 10 17:03:36 crc kubenswrapper[5036]: I0110 17:03:36.579732 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pv4sb" podStartSLOduration=3.137304209 podStartE2EDuration="8.579713977s" podCreationTimestamp="2026-01-10 17:03:28 +0000 UTC" firstStartedPulling="2026-01-10 17:03:30.480869834 +0000 UTC m=+2132.351105338" lastFinishedPulling="2026-01-10 17:03:35.923279612 +0000 UTC m=+2137.793515106" observedRunningTime="2026-01-10 17:03:36.571984296 +0000 UTC m=+2138.442219800" watchObservedRunningTime="2026-01-10 17:03:36.579713977 +0000 UTC m=+2138.449949471" Jan 10 17:03:39 crc kubenswrapper[5036]: I0110 17:03:39.136251 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:39 crc kubenswrapper[5036]: I0110 17:03:39.136565 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:40 crc kubenswrapper[5036]: I0110 17:03:40.188728 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pv4sb" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="registry-server" probeResult="failure" output=< Jan 10 17:03:40 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 17:03:40 crc kubenswrapper[5036]: > Jan 10 17:03:49 crc kubenswrapper[5036]: I0110 17:03:49.191291 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:49 crc kubenswrapper[5036]: I0110 17:03:49.257923 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:49 crc kubenswrapper[5036]: I0110 17:03:49.434502 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:50 crc kubenswrapper[5036]: I0110 17:03:50.689438 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pv4sb" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="registry-server" containerID="cri-o://bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8" gracePeriod=2 Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.111531 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.248528 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities\") pod \"1835af21-debc-47a7-a7a8-05795ea58a44\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.248940 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzkrp\" (UniqueName: \"kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp\") pod \"1835af21-debc-47a7-a7a8-05795ea58a44\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.249101 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content\") pod \"1835af21-debc-47a7-a7a8-05795ea58a44\" (UID: \"1835af21-debc-47a7-a7a8-05795ea58a44\") " Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.249787 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities" (OuterVolumeSpecName: "utilities") pod "1835af21-debc-47a7-a7a8-05795ea58a44" (UID: "1835af21-debc-47a7-a7a8-05795ea58a44"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.258919 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp" (OuterVolumeSpecName: "kube-api-access-tzkrp") pod "1835af21-debc-47a7-a7a8-05795ea58a44" (UID: "1835af21-debc-47a7-a7a8-05795ea58a44"). InnerVolumeSpecName "kube-api-access-tzkrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.351390 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.351430 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzkrp\" (UniqueName: \"kubernetes.io/projected/1835af21-debc-47a7-a7a8-05795ea58a44-kube-api-access-tzkrp\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.374476 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1835af21-debc-47a7-a7a8-05795ea58a44" (UID: "1835af21-debc-47a7-a7a8-05795ea58a44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.453391 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1835af21-debc-47a7-a7a8-05795ea58a44-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.699208 5036 generic.go:334] "Generic (PLEG): container finished" podID="1835af21-debc-47a7-a7a8-05795ea58a44" containerID="bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8" exitCode=0 Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.699269 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pv4sb" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.699266 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerDied","Data":"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8"} Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.699460 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pv4sb" event={"ID":"1835af21-debc-47a7-a7a8-05795ea58a44","Type":"ContainerDied","Data":"81fce7adeec3d8d3a466e071b23c8a3a1e552b5cfcb68e5b052a3d92445dab78"} Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.699489 5036 scope.go:117] "RemoveContainer" containerID="bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.727415 5036 scope.go:117] "RemoveContainer" containerID="027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.737979 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.746763 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pv4sb"] Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.762427 5036 scope.go:117] "RemoveContainer" containerID="b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.788886 5036 scope.go:117] "RemoveContainer" containerID="bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8" Jan 10 17:03:51 crc kubenswrapper[5036]: E0110 17:03:51.789229 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8\": container with ID starting with bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8 not found: ID does not exist" containerID="bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.789263 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8"} err="failed to get container status \"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8\": rpc error: code = NotFound desc = could not find container \"bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8\": container with ID starting with bbf820267ee754f4badfb0982e323fc4899e29b731388c8bf5c53efc945a53e8 not found: ID does not exist" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.789286 5036 scope.go:117] "RemoveContainer" containerID="027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0" Jan 10 17:03:51 crc kubenswrapper[5036]: E0110 17:03:51.789553 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0\": container with ID starting with 027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0 not found: ID does not exist" containerID="027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.789579 5036 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0"} err="failed to get container status \"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0\": rpc error: code = NotFound desc = could not find container \"027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0\": container with ID starting with 027886b4b8f6fe219a69c064cf00ef69c1de730b57a31c7ce749d311d99fecf0 not found: ID does not exist" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.789594 5036 scope.go:117] "RemoveContainer" containerID="b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7" Jan 10 17:03:51 crc kubenswrapper[5036]: E0110 17:03:51.790150 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7\": container with ID starting with b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7 not found: ID does not exist" containerID="b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7" Jan 10 17:03:51 crc kubenswrapper[5036]: I0110 17:03:51.790178 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7"} err="failed to get container status \"b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7\": rpc error: code = NotFound desc = could not find container \"b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7\": container with ID starting with b63b6ac742cea22e2ee4cb05ff52e640d0f607b00b2692fe7e9014cbc39179b7 not found: ID does not exist" Jan 10 17:03:52 crc kubenswrapper[5036]: I0110 17:03:52.524943 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" path="/var/lib/kubelet/pods/1835af21-debc-47a7-a7a8-05795ea58a44/volumes" Jan 10 17:03:55 crc kubenswrapper[5036]: I0110 17:03:55.904700 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:03:55 crc kubenswrapper[5036]: I0110 17:03:55.905075 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:04:03 crc kubenswrapper[5036]: I0110 17:04:03.800540 5036 generic.go:334] "Generic (PLEG): container finished" podID="bf597c03-b76a-445a-84d3-034d70ca102e" containerID="1af236997f0e330b3ee4af2be22c88567138c483d380c6d36aef6841829609c0" exitCode=0 Jan 10 17:04:03 crc kubenswrapper[5036]: I0110 17:04:03.800572 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" event={"ID":"bf597c03-b76a-445a-84d3-034d70ca102e","Type":"ContainerDied","Data":"1af236997f0e330b3ee4af2be22c88567138c483d380c6d36aef6841829609c0"} Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.224002 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.309910 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory\") pod \"bf597c03-b76a-445a-84d3-034d70ca102e\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.310009 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam\") pod \"bf597c03-b76a-445a-84d3-034d70ca102e\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.310135 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdjdm\" (UniqueName: \"kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm\") pod \"bf597c03-b76a-445a-84d3-034d70ca102e\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.310180 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph\") pod \"bf597c03-b76a-445a-84d3-034d70ca102e\" (UID: \"bf597c03-b76a-445a-84d3-034d70ca102e\") " Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.315813 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph" (OuterVolumeSpecName: "ceph") pod "bf597c03-b76a-445a-84d3-034d70ca102e" (UID: "bf597c03-b76a-445a-84d3-034d70ca102e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.315912 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm" (OuterVolumeSpecName: "kube-api-access-hdjdm") pod "bf597c03-b76a-445a-84d3-034d70ca102e" (UID: "bf597c03-b76a-445a-84d3-034d70ca102e"). InnerVolumeSpecName "kube-api-access-hdjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.335775 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory" (OuterVolumeSpecName: "inventory") pod "bf597c03-b76a-445a-84d3-034d70ca102e" (UID: "bf597c03-b76a-445a-84d3-034d70ca102e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.337074 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bf597c03-b76a-445a-84d3-034d70ca102e" (UID: "bf597c03-b76a-445a-84d3-034d70ca102e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.412076 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.412118 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdjdm\" (UniqueName: \"kubernetes.io/projected/bf597c03-b76a-445a-84d3-034d70ca102e-kube-api-access-hdjdm\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.412130 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.412142 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf597c03-b76a-445a-84d3-034d70ca102e-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.823437 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" event={"ID":"bf597c03-b76a-445a-84d3-034d70ca102e","Type":"ContainerDied","Data":"7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea"} Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.823474 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f9a93d9be437f4fdcda8e28c7700d6c1cb743b94847b4333b36053f2fcc4dea" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.823524 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-b4gng" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.917773 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jw6dr"] Jan 10 17:04:05 crc kubenswrapper[5036]: E0110 17:04:05.919460 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="extract-content" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919503 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="extract-content" Jan 10 17:04:05 crc kubenswrapper[5036]: E0110 17:04:05.919530 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="extract-utilities" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919542 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="extract-utilities" Jan 10 17:04:05 crc kubenswrapper[5036]: E0110 17:04:05.919558 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="registry-server" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919568 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="registry-server" Jan 10 17:04:05 crc kubenswrapper[5036]: E0110 17:04:05.919595 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf597c03-b76a-445a-84d3-034d70ca102e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919608 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf597c03-b76a-445a-84d3-034d70ca102e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919878 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf597c03-b76a-445a-84d3-034d70ca102e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.919897 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1835af21-debc-47a7-a7a8-05795ea58a44" containerName="registry-server" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.920641 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.924671 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.925017 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.926476 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.926608 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.928949 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:04:05 crc kubenswrapper[5036]: I0110 17:04:05.934351 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jw6dr"] Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.022808 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.022943 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.023129 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.023215 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzpdw\" (UniqueName: \"kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.125923 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.126072 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" 
(UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.126169 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzpdw\" (UniqueName: \"kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.126389 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.130858 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.130878 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.136346 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.157317 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzpdw\" (UniqueName: \"kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw\") pod \"ssh-known-hosts-edpm-deployment-jw6dr\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.235719 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.779897 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.782833 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.795445 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.840418 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fl56\" (UniqueName: \"kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.840599 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.840659 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.851071 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" event={"ID":"35cc2e15-b6d3-419a-b719-1fcee66ce1b5","Type":"ContainerStarted","Data":"0c699651b3efe8b6f344ddbb1eeda94c044806906ca6392e2b4ed3a215d1975d"} Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.861928 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jw6dr"] Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.942230 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.942292 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.942359 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fl56\" (UniqueName: \"kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.942867 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " 
pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.942930 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:06 crc kubenswrapper[5036]: I0110 17:04:06.962542 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fl56\" (UniqueName: \"kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56\") pod \"community-operators-6n7w7\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.117023 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.677095 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.862043 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" event={"ID":"35cc2e15-b6d3-419a-b719-1fcee66ce1b5","Type":"ContainerStarted","Data":"a77af487c92e4616792e49129f24263f0c3e20ec0e92a5451993f1b5bd8c91f6"} Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.864811 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerStarted","Data":"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c"} Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.864874 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerStarted","Data":"a0b6e338ca4ae875c766a2998ea549e98a976cc6c01a56dad6e1c3223a0151f1"} Jan 10 17:04:07 crc kubenswrapper[5036]: I0110 17:04:07.880659 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" podStartSLOduration=2.4014526849999998 podStartE2EDuration="2.880638016s" podCreationTimestamp="2026-01-10 17:04:05 +0000 UTC" firstStartedPulling="2026-01-10 17:04:06.837655241 +0000 UTC m=+2168.707890735" lastFinishedPulling="2026-01-10 17:04:07.316840572 +0000 UTC m=+2169.187076066" observedRunningTime="2026-01-10 17:04:07.880523493 +0000 UTC m=+2169.750758997" watchObservedRunningTime="2026-01-10 17:04:07.880638016 +0000 UTC m=+2169.750873530" Jan 10 17:04:08 crc kubenswrapper[5036]: I0110 17:04:08.877045 5036 generic.go:334] "Generic (PLEG): container finished" podID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerID="028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c" exitCode=0 Jan 10 17:04:08 crc kubenswrapper[5036]: I0110 17:04:08.877118 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerDied","Data":"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c"} Jan 10 17:04:09 crc kubenswrapper[5036]: I0110 17:04:09.889299 5036 generic.go:334] "Generic (PLEG): container finished" podID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" 
containerID="d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420" exitCode=0 Jan 10 17:04:09 crc kubenswrapper[5036]: I0110 17:04:09.889404 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerDied","Data":"d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420"} Jan 10 17:04:10 crc kubenswrapper[5036]: I0110 17:04:10.897672 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerStarted","Data":"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a"} Jan 10 17:04:10 crc kubenswrapper[5036]: I0110 17:04:10.929757 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6n7w7" podStartSLOduration=2.418066155 podStartE2EDuration="4.929737843s" podCreationTimestamp="2026-01-10 17:04:06 +0000 UTC" firstStartedPulling="2026-01-10 17:04:07.867398537 +0000 UTC m=+2169.737634041" lastFinishedPulling="2026-01-10 17:04:10.379070235 +0000 UTC m=+2172.249305729" observedRunningTime="2026-01-10 17:04:10.912612563 +0000 UTC m=+2172.782848057" watchObservedRunningTime="2026-01-10 17:04:10.929737843 +0000 UTC m=+2172.799973337" Jan 10 17:04:16 crc kubenswrapper[5036]: I0110 17:04:16.946982 5036 generic.go:334] "Generic (PLEG): container finished" podID="35cc2e15-b6d3-419a-b719-1fcee66ce1b5" containerID="a77af487c92e4616792e49129f24263f0c3e20ec0e92a5451993f1b5bd8c91f6" exitCode=0 Jan 10 17:04:16 crc kubenswrapper[5036]: I0110 17:04:16.947042 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" event={"ID":"35cc2e15-b6d3-419a-b719-1fcee66ce1b5","Type":"ContainerDied","Data":"a77af487c92e4616792e49129f24263f0c3e20ec0e92a5451993f1b5bd8c91f6"} Jan 10 17:04:17 crc kubenswrapper[5036]: I0110 17:04:17.117409 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:17 crc kubenswrapper[5036]: I0110 17:04:17.117452 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:17 crc kubenswrapper[5036]: I0110 17:04:17.180497 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.026696 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.083712 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.505058 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.524765 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzpdw\" (UniqueName: \"kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw\") pod \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.524842 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam\") pod \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.524891 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph\") pod \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.524919 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0\") pod \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\" (UID: \"35cc2e15-b6d3-419a-b719-1fcee66ce1b5\") " Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.556782 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph" (OuterVolumeSpecName: "ceph") pod "35cc2e15-b6d3-419a-b719-1fcee66ce1b5" (UID: "35cc2e15-b6d3-419a-b719-1fcee66ce1b5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.560409 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw" (OuterVolumeSpecName: "kube-api-access-nzpdw") pod "35cc2e15-b6d3-419a-b719-1fcee66ce1b5" (UID: "35cc2e15-b6d3-419a-b719-1fcee66ce1b5"). InnerVolumeSpecName "kube-api-access-nzpdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.630021 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35cc2e15-b6d3-419a-b719-1fcee66ce1b5" (UID: "35cc2e15-b6d3-419a-b719-1fcee66ce1b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.633954 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzpdw\" (UniqueName: \"kubernetes.io/projected/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-kube-api-access-nzpdw\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.633994 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.634038 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.660013 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "35cc2e15-b6d3-419a-b719-1fcee66ce1b5" (UID: "35cc2e15-b6d3-419a-b719-1fcee66ce1b5"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.735495 5036 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/35cc2e15-b6d3-419a-b719-1fcee66ce1b5-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.968015 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.968002 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jw6dr" event={"ID":"35cc2e15-b6d3-419a-b719-1fcee66ce1b5","Type":"ContainerDied","Data":"0c699651b3efe8b6f344ddbb1eeda94c044806906ca6392e2b4ed3a215d1975d"} Jan 10 17:04:18 crc kubenswrapper[5036]: I0110 17:04:18.968741 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c699651b3efe8b6f344ddbb1eeda94c044806906ca6392e2b4ed3a215d1975d" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.063566 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h"] Jan 10 17:04:19 crc kubenswrapper[5036]: E0110 17:04:19.064248 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cc2e15-b6d3-419a-b719-1fcee66ce1b5" containerName="ssh-known-hosts-edpm-deployment" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.064266 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cc2e15-b6d3-419a-b719-1fcee66ce1b5" containerName="ssh-known-hosts-edpm-deployment" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.064521 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cc2e15-b6d3-419a-b719-1fcee66ce1b5" containerName="ssh-known-hosts-edpm-deployment" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.068605 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.085361 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.085791 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.085823 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.086749 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.088723 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.090472 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h"] Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.247860 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5kx9\" (UniqueName: \"kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.248017 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.248101 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.248203 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.349292 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.349387 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5kx9\" (UniqueName: 
\"kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.349432 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.349481 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.354645 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.357147 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.358309 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.382226 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5kx9\" (UniqueName: \"kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mw76h\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.417648 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.971496 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h"] Jan 10 17:04:19 crc kubenswrapper[5036]: I0110 17:04:19.990516 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6n7w7" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="registry-server" containerID="cri-o://ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a" gracePeriod=2 Jan 10 17:04:20 crc kubenswrapper[5036]: I0110 17:04:20.918415 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:20 crc kubenswrapper[5036]: I0110 17:04:20.999737 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" event={"ID":"956e3be3-ef01-423c-a80d-1b6c517aee91","Type":"ContainerStarted","Data":"9b59f3184858bb5ffa7449c24a98885e44bf4f1188c2fd426f8f3ade8e006b7d"} Jan 10 17:04:20 crc kubenswrapper[5036]: I0110 17:04:20.999789 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" event={"ID":"956e3be3-ef01-423c-a80d-1b6c517aee91","Type":"ContainerStarted","Data":"f7aa69f298d10f46dc49ea9da83b47dfef9105c5235647f2daf3cc2bd31fb2a3"} Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.003214 5036 generic.go:334] "Generic (PLEG): container finished" podID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerID="ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a" exitCode=0 Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.003241 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerDied","Data":"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a"} Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.003257 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6n7w7" event={"ID":"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f","Type":"ContainerDied","Data":"a0b6e338ca4ae875c766a2998ea549e98a976cc6c01a56dad6e1c3223a0151f1"} Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.003274 5036 scope.go:117] "RemoveContainer" containerID="ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.003383 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6n7w7" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.021550 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" podStartSLOduration=1.5679396620000001 podStartE2EDuration="2.021416397s" podCreationTimestamp="2026-01-10 17:04:19 +0000 UTC" firstStartedPulling="2026-01-10 17:04:19.983193688 +0000 UTC m=+2181.853429192" lastFinishedPulling="2026-01-10 17:04:20.436670423 +0000 UTC m=+2182.306905927" observedRunningTime="2026-01-10 17:04:21.013425078 +0000 UTC m=+2182.883660602" watchObservedRunningTime="2026-01-10 17:04:21.021416397 +0000 UTC m=+2182.891651911" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.030773 5036 scope.go:117] "RemoveContainer" containerID="d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.056797 5036 scope.go:117] "RemoveContainer" containerID="028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.075972 5036 scope.go:117] "RemoveContainer" containerID="ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a" Jan 10 17:04:21 crc kubenswrapper[5036]: E0110 17:04:21.076563 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a\": container with ID starting with ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a not found: ID does not exist" containerID="ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.076601 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a"} err="failed to get container status \"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a\": rpc error: code = NotFound desc = could not find container \"ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a\": container with ID starting with ef225c3119d85b1fb37abc328a023a84bfbeebad0555ccc768fe11091f94208a not found: ID does not exist" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.076625 5036 scope.go:117] "RemoveContainer" containerID="d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420" Jan 10 17:04:21 crc kubenswrapper[5036]: E0110 17:04:21.084262 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420\": container with ID starting with d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420 not found: ID does not exist" containerID="d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.084312 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420"} err="failed to get container status \"d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420\": rpc error: code = NotFound desc = could not find container \"d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420\": container with ID starting with d49a9566c228a746489cf31fb3e2624c0c45a9692d89072834e850ecca5ed420 not found: ID does 
not exist" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.084348 5036 scope.go:117] "RemoveContainer" containerID="028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c" Jan 10 17:04:21 crc kubenswrapper[5036]: E0110 17:04:21.084657 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c\": container with ID starting with 028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c not found: ID does not exist" containerID="028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.084704 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c"} err="failed to get container status \"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c\": rpc error: code = NotFound desc = could not find container \"028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c\": container with ID starting with 028b557d8e1794ac33839e73cf405136e3a89cb053d8750de9ffba9122017e6c not found: ID does not exist" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.090091 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities\") pod \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.090284 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fl56\" (UniqueName: \"kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56\") pod \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.090342 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content\") pod \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\" (UID: \"1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f\") " Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.090987 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities" (OuterVolumeSpecName: "utilities") pod "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" (UID: "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.091099 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.096640 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56" (OuterVolumeSpecName: "kube-api-access-8fl56") pod "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" (UID: "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f"). InnerVolumeSpecName "kube-api-access-8fl56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.143771 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" (UID: "1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.192469 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fl56\" (UniqueName: \"kubernetes.io/projected/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-kube-api-access-8fl56\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.192512 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.368517 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:21 crc kubenswrapper[5036]: I0110 17:04:21.374355 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6n7w7"] Jan 10 17:04:22 crc kubenswrapper[5036]: I0110 17:04:22.528423 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" path="/var/lib/kubelet/pods/1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f/volumes" Jan 10 17:04:25 crc kubenswrapper[5036]: I0110 17:04:25.904123 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:04:25 crc kubenswrapper[5036]: I0110 17:04:25.904540 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:04:25 crc kubenswrapper[5036]: I0110 17:04:25.904590 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:04:25 crc kubenswrapper[5036]: I0110 17:04:25.905315 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:04:25 crc kubenswrapper[5036]: I0110 17:04:25.905365 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5" gracePeriod=600 Jan 10 17:04:26 crc kubenswrapper[5036]: I0110 17:04:26.063647 5036 generic.go:334] "Generic (PLEG): container finished" 
podID="79756361-741e-4470-831b-6ee092bc6277" containerID="9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5" exitCode=0 Jan 10 17:04:26 crc kubenswrapper[5036]: I0110 17:04:26.063727 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5"} Jan 10 17:04:26 crc kubenswrapper[5036]: I0110 17:04:26.064605 5036 scope.go:117] "RemoveContainer" containerID="4cfe9bf945af886b43320632cc61b871fc0801de3a562fa7db95bb30ff540219" Jan 10 17:04:27 crc kubenswrapper[5036]: I0110 17:04:27.079904 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af"} Jan 10 17:04:28 crc kubenswrapper[5036]: I0110 17:04:28.097617 5036 generic.go:334] "Generic (PLEG): container finished" podID="956e3be3-ef01-423c-a80d-1b6c517aee91" containerID="9b59f3184858bb5ffa7449c24a98885e44bf4f1188c2fd426f8f3ade8e006b7d" exitCode=0 Jan 10 17:04:28 crc kubenswrapper[5036]: I0110 17:04:28.099109 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" event={"ID":"956e3be3-ef01-423c-a80d-1b6c517aee91","Type":"ContainerDied","Data":"9b59f3184858bb5ffa7449c24a98885e44bf4f1188c2fd426f8f3ade8e006b7d"} Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.541865 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.636328 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory\") pod \"956e3be3-ef01-423c-a80d-1b6c517aee91\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.636529 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5kx9\" (UniqueName: \"kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9\") pod \"956e3be3-ef01-423c-a80d-1b6c517aee91\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.636584 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph\") pod \"956e3be3-ef01-423c-a80d-1b6c517aee91\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.636614 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam\") pod \"956e3be3-ef01-423c-a80d-1b6c517aee91\" (UID: \"956e3be3-ef01-423c-a80d-1b6c517aee91\") " Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.650881 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph" (OuterVolumeSpecName: "ceph") pod "956e3be3-ef01-423c-a80d-1b6c517aee91" (UID: "956e3be3-ef01-423c-a80d-1b6c517aee91"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.650893 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9" (OuterVolumeSpecName: "kube-api-access-p5kx9") pod "956e3be3-ef01-423c-a80d-1b6c517aee91" (UID: "956e3be3-ef01-423c-a80d-1b6c517aee91"). InnerVolumeSpecName "kube-api-access-p5kx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.667449 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "956e3be3-ef01-423c-a80d-1b6c517aee91" (UID: "956e3be3-ef01-423c-a80d-1b6c517aee91"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.671790 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory" (OuterVolumeSpecName: "inventory") pod "956e3be3-ef01-423c-a80d-1b6c517aee91" (UID: "956e3be3-ef01-423c-a80d-1b6c517aee91"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.739114 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.739160 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5kx9\" (UniqueName: \"kubernetes.io/projected/956e3be3-ef01-423c-a80d-1b6c517aee91-kube-api-access-p5kx9\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.739176 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:29 crc kubenswrapper[5036]: I0110 17:04:29.739188 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/956e3be3-ef01-423c-a80d-1b6c517aee91-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.117833 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" event={"ID":"956e3be3-ef01-423c-a80d-1b6c517aee91","Type":"ContainerDied","Data":"f7aa69f298d10f46dc49ea9da83b47dfef9105c5235647f2daf3cc2bd31fb2a3"} Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.117871 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7aa69f298d10f46dc49ea9da83b47dfef9105c5235647f2daf3cc2bd31fb2a3" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.117903 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mw76h" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.198435 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f"] Jan 10 17:04:30 crc kubenswrapper[5036]: E0110 17:04:30.198899 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e3be3-ef01-423c-a80d-1b6c517aee91" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.198922 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e3be3-ef01-423c-a80d-1b6c517aee91" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:30 crc kubenswrapper[5036]: E0110 17:04:30.198945 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="registry-server" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.198952 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="registry-server" Jan 10 17:04:30 crc kubenswrapper[5036]: E0110 17:04:30.198969 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="extract-utilities" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.198977 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="extract-utilities" Jan 10 17:04:30 crc kubenswrapper[5036]: E0110 17:04:30.198989 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="extract-content" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.198996 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="extract-content" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.199207 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c38c16b-cbc8-4000-bffa-6be8b3cbbe3f" containerName="registry-server" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.199231 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="956e3be3-ef01-423c-a80d-1b6c517aee91" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.200006 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.202548 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.204332 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.204528 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.204927 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.205104 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.218191 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f"] Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.368493 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnx5k\" (UniqueName: \"kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.368853 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.368902 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.368922 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.470507 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.470559 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.470668 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnx5k\" (UniqueName: \"kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.470736 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.476122 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.476647 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.483746 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.486200 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnx5k\" (UniqueName: \"kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:30 crc kubenswrapper[5036]: I0110 17:04:30.530514 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:31 crc kubenswrapper[5036]: I0110 17:04:31.110884 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f"] Jan 10 17:04:31 crc kubenswrapper[5036]: W0110 17:04:31.122644 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf551e3a3_cdf6_4fc6_8452_869afe1cef86.slice/crio-b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254 WatchSource:0}: Error finding container b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254: Status 404 returned error can't find the container with id b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254 Jan 10 17:04:31 crc kubenswrapper[5036]: I0110 17:04:31.138637 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" event={"ID":"f551e3a3-cdf6-4fc6-8452-869afe1cef86","Type":"ContainerStarted","Data":"b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254"} Jan 10 17:04:32 crc kubenswrapper[5036]: I0110 17:04:32.149294 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" event={"ID":"f551e3a3-cdf6-4fc6-8452-869afe1cef86","Type":"ContainerStarted","Data":"0181fabcd6cfd84f6657e328fb26b1af3090bdd447e54b7717444b1d6a234149"} Jan 10 17:04:32 crc kubenswrapper[5036]: I0110 17:04:32.173188 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" podStartSLOduration=1.64459087 podStartE2EDuration="2.173155205s" podCreationTimestamp="2026-01-10 17:04:30 +0000 UTC" firstStartedPulling="2026-01-10 17:04:31.130344345 +0000 UTC m=+2193.000579839" lastFinishedPulling="2026-01-10 17:04:31.65890867 +0000 UTC m=+2193.529144174" observedRunningTime="2026-01-10 17:04:32.172598859 +0000 UTC m=+2194.042834393" watchObservedRunningTime="2026-01-10 17:04:32.173155205 +0000 UTC m=+2194.043390739" Jan 10 17:04:41 crc kubenswrapper[5036]: I0110 17:04:41.236466 5036 generic.go:334] "Generic (PLEG): container finished" podID="f551e3a3-cdf6-4fc6-8452-869afe1cef86" containerID="0181fabcd6cfd84f6657e328fb26b1af3090bdd447e54b7717444b1d6a234149" exitCode=0 Jan 10 17:04:41 crc kubenswrapper[5036]: I0110 17:04:41.236556 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" event={"ID":"f551e3a3-cdf6-4fc6-8452-869afe1cef86","Type":"ContainerDied","Data":"0181fabcd6cfd84f6657e328fb26b1af3090bdd447e54b7717444b1d6a234149"} Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.694930 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.806319 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory\") pod \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.806495 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam\") pod \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.806528 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnx5k\" (UniqueName: \"kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k\") pod \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.806591 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph\") pod \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\" (UID: \"f551e3a3-cdf6-4fc6-8452-869afe1cef86\") " Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.814722 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph" (OuterVolumeSpecName: "ceph") pod "f551e3a3-cdf6-4fc6-8452-869afe1cef86" (UID: "f551e3a3-cdf6-4fc6-8452-869afe1cef86"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.814952 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k" (OuterVolumeSpecName: "kube-api-access-mnx5k") pod "f551e3a3-cdf6-4fc6-8452-869afe1cef86" (UID: "f551e3a3-cdf6-4fc6-8452-869afe1cef86"). InnerVolumeSpecName "kube-api-access-mnx5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.841046 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f551e3a3-cdf6-4fc6-8452-869afe1cef86" (UID: "f551e3a3-cdf6-4fc6-8452-869afe1cef86"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.845889 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory" (OuterVolumeSpecName: "inventory") pod "f551e3a3-cdf6-4fc6-8452-869afe1cef86" (UID: "f551e3a3-cdf6-4fc6-8452-869afe1cef86"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.908301 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.908336 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.908353 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f551e3a3-cdf6-4fc6-8452-869afe1cef86-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:42 crc kubenswrapper[5036]: I0110 17:04:42.908372 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnx5k\" (UniqueName: \"kubernetes.io/projected/f551e3a3-cdf6-4fc6-8452-869afe1cef86-kube-api-access-mnx5k\") on node \"crc\" DevicePath \"\"" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.255905 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" event={"ID":"f551e3a3-cdf6-4fc6-8452-869afe1cef86","Type":"ContainerDied","Data":"b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254"} Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.256177 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2df03f3338cbb9839c6574f0e5b1dbae6db63b9e6da0668a12ccd19dea6b254" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.255996 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.408988 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9"] Jan 10 17:04:43 crc kubenswrapper[5036]: E0110 17:04:43.409545 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f551e3a3-cdf6-4fc6-8452-869afe1cef86" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.409578 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="f551e3a3-cdf6-4fc6-8452-869afe1cef86" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.409909 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="f551e3a3-cdf6-4fc6-8452-869afe1cef86" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.410860 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.415336 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.415553 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.415713 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.417265 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.417523 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.417814 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.418627 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.422376 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.432868 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9"] Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518462 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518524 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518556 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518580 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518610 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518645 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518709 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518739 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518770 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zxck\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.518945 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.519044 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: 
I0110 17:04:43.519188 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.519242 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.621557 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.621658 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.621794 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.622565 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.622623 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.622662 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.622833 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.622972 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.623032 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.623090 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zxck\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.623189 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.623236 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.623295 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 
17:04:43.626798 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.627079 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.627872 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.628825 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.628957 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.629473 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.630071 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.630479 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.631428 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.632467 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.637285 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.644366 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zxck\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.644568 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:43 crc kubenswrapper[5036]: I0110 17:04:43.730795 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:04:44 crc kubenswrapper[5036]: I0110 17:04:44.337819 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9"] Jan 10 17:04:45 crc kubenswrapper[5036]: I0110 17:04:45.272817 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" event={"ID":"421d37b9-14cd-4270-b305-c6f946cd32a3","Type":"ContainerStarted","Data":"4dda6bfd075cdb41f85b5d43617ec23666ea1dab24ebad21520673806b571ac2"} Jan 10 17:04:45 crc kubenswrapper[5036]: I0110 17:04:45.273228 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" event={"ID":"421d37b9-14cd-4270-b305-c6f946cd32a3","Type":"ContainerStarted","Data":"5ab28cca83a94be486816f72e3da0ec4ff9e9bbf7fbd2d67fc876b57bebe2ca0"} Jan 10 17:04:45 crc kubenswrapper[5036]: I0110 17:04:45.304184 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" podStartSLOduration=1.777593389 podStartE2EDuration="2.304163536s" podCreationTimestamp="2026-01-10 17:04:43 +0000 UTC" firstStartedPulling="2026-01-10 17:04:44.336422876 +0000 UTC m=+2206.206658390" lastFinishedPulling="2026-01-10 17:04:44.862993043 +0000 UTC m=+2206.733228537" observedRunningTime="2026-01-10 17:04:45.291185384 +0000 UTC m=+2207.161420888" watchObservedRunningTime="2026-01-10 17:04:45.304163536 +0000 UTC m=+2207.174399040" Jan 10 17:05:17 crc kubenswrapper[5036]: I0110 17:05:17.548255 5036 generic.go:334] "Generic (PLEG): container finished" podID="421d37b9-14cd-4270-b305-c6f946cd32a3" containerID="4dda6bfd075cdb41f85b5d43617ec23666ea1dab24ebad21520673806b571ac2" exitCode=0 Jan 10 17:05:17 crc kubenswrapper[5036]: I0110 17:05:17.548363 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" event={"ID":"421d37b9-14cd-4270-b305-c6f946cd32a3","Type":"ContainerDied","Data":"4dda6bfd075cdb41f85b5d43617ec23666ea1dab24ebad21520673806b571ac2"} Jan 10 17:05:18 crc kubenswrapper[5036]: I0110 17:05:18.971606 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.076909 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.076989 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077044 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077117 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077152 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zxck\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077252 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077306 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077344 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077403 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" 
(UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077455 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077520 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077590 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.077647 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle\") pod \"421d37b9-14cd-4270-b305-c6f946cd32a3\" (UID: \"421d37b9-14cd-4270-b305-c6f946cd32a3\") " Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.083701 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.084173 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.084918 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.085420 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.085916 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.086033 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.086872 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.087617 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph" (OuterVolumeSpecName: "ceph") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.087705 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck" (OuterVolumeSpecName: "kube-api-access-9zxck") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "kube-api-access-9zxck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.087979 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.097262 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.109058 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory" (OuterVolumeSpecName: "inventory") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.109598 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "421d37b9-14cd-4270-b305-c6f946cd32a3" (UID: "421d37b9-14cd-4270-b305-c6f946cd32a3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179550 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179592 5036 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179606 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179618 5036 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179634 5036 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179648 5036 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179661 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179692 5036 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179703 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zxck\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-kube-api-access-9zxck\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 
17:05:19.179715 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179727 5036 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179739 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/421d37b9-14cd-4270-b305-c6f946cd32a3-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.179750 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/421d37b9-14cd-4270-b305-c6f946cd32a3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.567500 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" event={"ID":"421d37b9-14cd-4270-b305-c6f946cd32a3","Type":"ContainerDied","Data":"5ab28cca83a94be486816f72e3da0ec4ff9e9bbf7fbd2d67fc876b57bebe2ca0"} Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.567547 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab28cca83a94be486816f72e3da0ec4ff9e9bbf7fbd2d67fc876b57bebe2ca0" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.567615 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.703930 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr"] Jan 10 17:05:19 crc kubenswrapper[5036]: E0110 17:05:19.704737 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="421d37b9-14cd-4270-b305-c6f946cd32a3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.704753 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="421d37b9-14cd-4270-b305-c6f946cd32a3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.704954 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="421d37b9-14cd-4270-b305-c6f946cd32a3" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.705608 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.708218 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.709144 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.709822 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.709947 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.726384 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.730586 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr"] Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.889709 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.889775 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.889809 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rplbw\" (UniqueName: \"kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.889851 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.991630 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.991809 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.991866 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.991895 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rplbw\" (UniqueName: \"kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:19 crc kubenswrapper[5036]: I0110 17:05:19.998196 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:19.999971 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:20.000423 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:20.014071 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rplbw\" (UniqueName: \"kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:20.035934 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:20.565029 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr"] Jan 10 17:05:20 crc kubenswrapper[5036]: I0110 17:05:20.572306 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 17:05:21 crc kubenswrapper[5036]: I0110 17:05:21.590483 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" event={"ID":"35435ad9-1b59-46c6-b2c7-a57b43c65a3d","Type":"ContainerStarted","Data":"af85b42344182263bcc969e151365654b853e55a3084fbc20b1fb4bce2c7b576"} Jan 10 17:05:21 crc kubenswrapper[5036]: I0110 17:05:21.591349 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" event={"ID":"35435ad9-1b59-46c6-b2c7-a57b43c65a3d","Type":"ContainerStarted","Data":"2b2e0160099a5edaa1b00735bb02a87ce5a126b601ea8802e3142c1d9df8334e"} Jan 10 17:05:21 crc kubenswrapper[5036]: I0110 17:05:21.624935 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" podStartSLOduration=2.110643116 podStartE2EDuration="2.624905311s" podCreationTimestamp="2026-01-10 17:05:19 +0000 UTC" firstStartedPulling="2026-01-10 17:05:20.572070384 +0000 UTC m=+2242.442305878" lastFinishedPulling="2026-01-10 17:05:21.086332579 +0000 UTC m=+2242.956568073" observedRunningTime="2026-01-10 17:05:21.612885646 +0000 UTC m=+2243.483121180" watchObservedRunningTime="2026-01-10 17:05:21.624905311 +0000 UTC m=+2243.495140845" Jan 10 17:05:26 crc kubenswrapper[5036]: I0110 17:05:26.637015 5036 generic.go:334] "Generic (PLEG): container finished" podID="35435ad9-1b59-46c6-b2c7-a57b43c65a3d" containerID="af85b42344182263bcc969e151365654b853e55a3084fbc20b1fb4bce2c7b576" exitCode=0 Jan 10 17:05:26 crc kubenswrapper[5036]: I0110 17:05:26.637101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" event={"ID":"35435ad9-1b59-46c6-b2c7-a57b43c65a3d","Type":"ContainerDied","Data":"af85b42344182263bcc969e151365654b853e55a3084fbc20b1fb4bce2c7b576"} Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.105272 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.250524 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph\") pod \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.250831 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rplbw\" (UniqueName: \"kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw\") pod \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.250908 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam\") pod \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.251034 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory\") pod \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\" (UID: \"35435ad9-1b59-46c6-b2c7-a57b43c65a3d\") " Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.256855 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw" (OuterVolumeSpecName: "kube-api-access-rplbw") pod "35435ad9-1b59-46c6-b2c7-a57b43c65a3d" (UID: "35435ad9-1b59-46c6-b2c7-a57b43c65a3d"). InnerVolumeSpecName "kube-api-access-rplbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.260639 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph" (OuterVolumeSpecName: "ceph") pod "35435ad9-1b59-46c6-b2c7-a57b43c65a3d" (UID: "35435ad9-1b59-46c6-b2c7-a57b43c65a3d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.276329 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "35435ad9-1b59-46c6-b2c7-a57b43c65a3d" (UID: "35435ad9-1b59-46c6-b2c7-a57b43c65a3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.295284 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory" (OuterVolumeSpecName: "inventory") pod "35435ad9-1b59-46c6-b2c7-a57b43c65a3d" (UID: "35435ad9-1b59-46c6-b2c7-a57b43c65a3d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.352814 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.352866 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rplbw\" (UniqueName: \"kubernetes.io/projected/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-kube-api-access-rplbw\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.352881 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.352889 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/35435ad9-1b59-46c6-b2c7-a57b43c65a3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.656894 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" event={"ID":"35435ad9-1b59-46c6-b2c7-a57b43c65a3d","Type":"ContainerDied","Data":"2b2e0160099a5edaa1b00735bb02a87ce5a126b601ea8802e3142c1d9df8334e"} Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.656967 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2e0160099a5edaa1b00735bb02a87ce5a126b601ea8802e3142c1d9df8334e" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.656985 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.810171 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj"] Jan 10 17:05:28 crc kubenswrapper[5036]: E0110 17:05:28.810588 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35435ad9-1b59-46c6-b2c7-a57b43c65a3d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.810610 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="35435ad9-1b59-46c6-b2c7-a57b43c65a3d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.810817 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="35435ad9-1b59-46c6-b2c7-a57b43c65a3d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.811469 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.813511 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.813867 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.822936 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj"] Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.829715 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.829763 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.830005 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.830265 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970340 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970416 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970453 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970516 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970541 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppmc\" (UniqueName: \"kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 
17:05:28 crc kubenswrapper[5036]: I0110 17:05:28.970619 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072010 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072113 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072182 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072719 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072800 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.072829 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppmc\" (UniqueName: \"kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.073815 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.076638 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.078072 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.078319 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.079123 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.090467 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppmc\" (UniqueName: \"kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-skrfj\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.180108 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:05:29 crc kubenswrapper[5036]: I0110 17:05:29.767979 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj"] Jan 10 17:05:30 crc kubenswrapper[5036]: I0110 17:05:30.680054 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" event={"ID":"7cb46990-94ee-4a82-93a2-a30c563f1146","Type":"ContainerStarted","Data":"dafa46ae29452b3ae1e4e0b7fad49f873c12cc5127e4ffe5db2f418470459c22"} Jan 10 17:05:30 crc kubenswrapper[5036]: I0110 17:05:30.680469 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" event={"ID":"7cb46990-94ee-4a82-93a2-a30c563f1146","Type":"ContainerStarted","Data":"9ac43b9ad039da19a0e22c70363635bcc10533f7a85a7451ebc6340d832601c9"} Jan 10 17:05:30 crc kubenswrapper[5036]: I0110 17:05:30.714054 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" podStartSLOduration=2.144014045 podStartE2EDuration="2.714030887s" podCreationTimestamp="2026-01-10 17:05:28 +0000 UTC" firstStartedPulling="2026-01-10 17:05:29.777969634 +0000 UTC m=+2251.648205128" lastFinishedPulling="2026-01-10 17:05:30.347986476 +0000 UTC m=+2252.218221970" observedRunningTime="2026-01-10 17:05:30.704634718 +0000 UTC m=+2252.574870242" watchObservedRunningTime="2026-01-10 17:05:30.714030887 +0000 UTC m=+2252.584266391" Jan 10 17:06:44 crc kubenswrapper[5036]: I0110 17:06:44.374960 5036 generic.go:334] "Generic (PLEG): container finished" podID="7cb46990-94ee-4a82-93a2-a30c563f1146" containerID="dafa46ae29452b3ae1e4e0b7fad49f873c12cc5127e4ffe5db2f418470459c22" exitCode=0 Jan 10 17:06:44 crc kubenswrapper[5036]: I0110 17:06:44.375015 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" event={"ID":"7cb46990-94ee-4a82-93a2-a30c563f1146","Type":"ContainerDied","Data":"dafa46ae29452b3ae1e4e0b7fad49f873c12cc5127e4ffe5db2f418470459c22"} Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.751908 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850236 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850289 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850331 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hppmc\" (UniqueName: \"kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850396 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850520 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.850582 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0\") pod \"7cb46990-94ee-4a82-93a2-a30c563f1146\" (UID: \"7cb46990-94ee-4a82-93a2-a30c563f1146\") " Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.859603 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph" (OuterVolumeSpecName: "ceph") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.859708 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.867214 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc" (OuterVolumeSpecName: "kube-api-access-hppmc") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "kube-api-access-hppmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.886708 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory" (OuterVolumeSpecName: "inventory") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.896008 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.896051 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "7cb46990-94ee-4a82-93a2-a30c563f1146" (UID: "7cb46990-94ee-4a82-93a2-a30c563f1146"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953003 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953070 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953089 5036 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/7cb46990-94ee-4a82-93a2-a30c563f1146-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953108 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953127 5036 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb46990-94ee-4a82-93a2-a30c563f1146-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:45 crc kubenswrapper[5036]: I0110 17:06:45.953142 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hppmc\" (UniqueName: \"kubernetes.io/projected/7cb46990-94ee-4a82-93a2-a30c563f1146-kube-api-access-hppmc\") on node \"crc\" DevicePath \"\"" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.393861 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" event={"ID":"7cb46990-94ee-4a82-93a2-a30c563f1146","Type":"ContainerDied","Data":"9ac43b9ad039da19a0e22c70363635bcc10533f7a85a7451ebc6340d832601c9"} Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.393903 5036 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="9ac43b9ad039da19a0e22c70363635bcc10533f7a85a7451ebc6340d832601c9" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.393976 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-skrfj" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.530328 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x"] Jan 10 17:06:46 crc kubenswrapper[5036]: E0110 17:06:46.531036 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb46990-94ee-4a82-93a2-a30c563f1146" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.531080 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb46990-94ee-4a82-93a2-a30c563f1146" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.531510 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb46990-94ee-4a82-93a2-a30c563f1146" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.532736 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.536581 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.536592 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.536875 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.536976 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.537489 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.537870 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.539292 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.553667 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x"] Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667069 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667124 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory\") 
pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667159 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgrx\" (UniqueName: \"kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667192 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667343 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667430 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.667490 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769555 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769640 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769704 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769826 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769857 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769891 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgrx\" (UniqueName: \"kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.769922 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.774702 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.775402 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.775961 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.776279 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.778775 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.783715 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.794210 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgrx\" (UniqueName: \"kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:46 crc kubenswrapper[5036]: I0110 17:06:46.864319 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:06:47 crc kubenswrapper[5036]: W0110 17:06:47.464898 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f111a6e_f987_4636_aada_aee2793d5047.slice/crio-a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64 WatchSource:0}: Error finding container a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64: Status 404 returned error can't find the container with id a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64 Jan 10 17:06:47 crc kubenswrapper[5036]: I0110 17:06:47.471844 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x"] Jan 10 17:06:48 crc kubenswrapper[5036]: I0110 17:06:48.416454 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" event={"ID":"3f111a6e-f987-4636-aada-aee2793d5047","Type":"ContainerStarted","Data":"0474fe33dc22330ea28c641c47e772712dd2f53b8a1b2878412165569355a187"} Jan 10 17:06:48 crc kubenswrapper[5036]: I0110 17:06:48.418215 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" event={"ID":"3f111a6e-f987-4636-aada-aee2793d5047","Type":"ContainerStarted","Data":"a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64"} Jan 10 17:06:48 crc kubenswrapper[5036]: I0110 17:06:48.449839 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" podStartSLOduration=1.887216251 podStartE2EDuration="2.449815783s" podCreationTimestamp="2026-01-10 17:06:46 +0000 UTC" firstStartedPulling="2026-01-10 17:06:47.468908492 +0000 UTC m=+2329.339143986" lastFinishedPulling="2026-01-10 17:06:48.031508014 +0000 UTC m=+2329.901743518" observedRunningTime="2026-01-10 17:06:48.446091276 +0000 UTC m=+2330.316326780" watchObservedRunningTime="2026-01-10 17:06:48.449815783 +0000 UTC m=+2330.320051287" Jan 10 17:06:55 crc kubenswrapper[5036]: I0110 17:06:55.904479 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:06:55 crc kubenswrapper[5036]: I0110 17:06:55.905252 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:07:25 crc kubenswrapper[5036]: I0110 17:07:25.904202 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:07:25 crc kubenswrapper[5036]: I0110 17:07:25.904710 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:07:51 crc kubenswrapper[5036]: I0110 17:07:51.003408 5036 generic.go:334] "Generic (PLEG): container finished" podID="3f111a6e-f987-4636-aada-aee2793d5047" containerID="0474fe33dc22330ea28c641c47e772712dd2f53b8a1b2878412165569355a187" exitCode=0 Jan 10 17:07:51 crc kubenswrapper[5036]: I0110 17:07:51.003476 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" event={"ID":"3f111a6e-f987-4636-aada-aee2793d5047","Type":"ContainerDied","Data":"0474fe33dc22330ea28c641c47e772712dd2f53b8a1b2878412165569355a187"} Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.418583 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523630 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523666 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523827 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523889 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523929 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgrx\" (UniqueName: \"kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523949 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.523977 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam\") pod \"3f111a6e-f987-4636-aada-aee2793d5047\" (UID: \"3f111a6e-f987-4636-aada-aee2793d5047\") " Jan 10 
17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.529528 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx" (OuterVolumeSpecName: "kube-api-access-xsgrx") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "kube-api-access-xsgrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.549381 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph" (OuterVolumeSpecName: "ceph") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.549617 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.554096 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.555581 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.556416 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.559202 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory" (OuterVolumeSpecName: "inventory") pod "3f111a6e-f987-4636-aada-aee2793d5047" (UID: "3f111a6e-f987-4636-aada-aee2793d5047"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626492 5036 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626524 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgrx\" (UniqueName: \"kubernetes.io/projected/3f111a6e-f987-4636-aada-aee2793d5047-kube-api-access-xsgrx\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626536 5036 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626548 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626557 5036 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626566 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:52 crc kubenswrapper[5036]: I0110 17:07:52.626573 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f111a6e-f987-4636-aada-aee2793d5047-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.021412 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" event={"ID":"3f111a6e-f987-4636-aada-aee2793d5047","Type":"ContainerDied","Data":"a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64"} Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.021451 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a553959703e40e3da0266d417b3a690c053dbb11c75f6e5db3e883c784cc1c64" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.021525 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.125248 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6"] Jan 10 17:07:53 crc kubenswrapper[5036]: E0110 17:07:53.125740 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f111a6e-f987-4636-aada-aee2793d5047" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.125765 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f111a6e-f987-4636-aada-aee2793d5047" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.125948 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f111a6e-f987-4636-aada-aee2793d5047" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.126659 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.128641 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.128708 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.129281 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.129356 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.129445 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.130439 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.147443 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6"] Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.244574 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.244646 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.245029 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.245148 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrkv8\" (UniqueName: \"kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.245213 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.245366 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.347954 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.348043 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.348182 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.348265 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrkv8\" (UniqueName: \"kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.348325 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.348399 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.353782 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.354785 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.354935 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.355978 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.358993 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.370429 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrkv8\" (UniqueName: \"kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:53 crc kubenswrapper[5036]: I0110 17:07:53.443006 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:07:54 crc kubenswrapper[5036]: I0110 17:07:54.099405 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6"] Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.041932 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" event={"ID":"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b","Type":"ContainerStarted","Data":"78396c084f2cd983c056438458c6658a615f8a975fd33a797b43740834f33094"} Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.042552 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" event={"ID":"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b","Type":"ContainerStarted","Data":"77ee35b6b6671efdec0d161d1653e4c73ba587ab44837618c466e67b0e1c1b0e"} Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.061817 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" podStartSLOduration=1.585089661 podStartE2EDuration="2.061792723s" podCreationTimestamp="2026-01-10 17:07:53 +0000 UTC" firstStartedPulling="2026-01-10 17:07:54.10176344 +0000 UTC m=+2395.971998974" lastFinishedPulling="2026-01-10 17:07:54.578466552 +0000 UTC m=+2396.448702036" observedRunningTime="2026-01-10 17:07:55.059170778 +0000 UTC m=+2396.929406292" watchObservedRunningTime="2026-01-10 17:07:55.061792723 +0000 UTC m=+2396.932028237" Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.903881 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.903948 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.904007 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.904859 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:07:55 crc kubenswrapper[5036]: I0110 17:07:55.904919 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" gracePeriod=600 Jan 10 17:07:56 crc kubenswrapper[5036]: E0110 17:07:56.030446 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79756361_741e_4470_831b_6ee092bc6277.slice/crio-conmon-d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79756361_741e_4470_831b_6ee092bc6277.slice/crio-d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af.scope\": RecentStats: unable to find data in memory cache]" Jan 10 17:07:56 crc kubenswrapper[5036]: E0110 17:07:56.041612 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:07:56 crc kubenswrapper[5036]: I0110 17:07:56.053409 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" exitCode=0 Jan 10 17:07:56 crc kubenswrapper[5036]: I0110 17:07:56.053472 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af"} Jan 10 17:07:56 crc kubenswrapper[5036]: I0110 17:07:56.053530 5036 scope.go:117] "RemoveContainer" containerID="9109549e278f48da54c19e23f5b37bdb271c9f61a90632945b7ebb3b8d6064d5" Jan 10 17:07:56 crc kubenswrapper[5036]: I0110 17:07:56.054250 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:07:56 crc kubenswrapper[5036]: E0110 17:07:56.054466 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:08:08 crc kubenswrapper[5036]: I0110 17:08:08.522488 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:08:08 crc kubenswrapper[5036]: E0110 17:08:08.523448 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:08:19 crc kubenswrapper[5036]: I0110 17:08:19.507803 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:08:19 crc kubenswrapper[5036]: E0110 17:08:19.508579 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:08:34 crc kubenswrapper[5036]: I0110 17:08:34.508103 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:08:34 crc kubenswrapper[5036]: E0110 17:08:34.509147 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:08:48 crc kubenswrapper[5036]: I0110 17:08:48.518615 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:08:48 crc kubenswrapper[5036]: E0110 17:08:48.519591 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:09:01 crc kubenswrapper[5036]: I0110 17:09:01.508091 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:09:01 crc kubenswrapper[5036]: E0110 17:09:01.509299 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:09:13 crc kubenswrapper[5036]: I0110 17:09:13.508782 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:09:13 crc kubenswrapper[5036]: E0110 17:09:13.510173 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:09:26 crc kubenswrapper[5036]: I0110 17:09:26.508416 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:09:26 crc kubenswrapper[5036]: E0110 17:09:26.509725 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:09:38 crc kubenswrapper[5036]: I0110 17:09:38.515175 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:09:38 crc kubenswrapper[5036]: E0110 17:09:38.516377 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:09:51 crc kubenswrapper[5036]: I0110 17:09:51.508412 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:09:51 crc kubenswrapper[5036]: E0110 17:09:51.509564 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:10:06 crc kubenswrapper[5036]: I0110 17:10:06.508994 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:10:06 crc kubenswrapper[5036]: E0110 17:10:06.510001 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:10:21 crc kubenswrapper[5036]: I0110 17:10:21.508911 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:10:21 crc kubenswrapper[5036]: E0110 17:10:21.509789 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.745846 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.748988 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.763776 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.888053 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.888784 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.888926 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zcs\" (UniqueName: \"kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.991184 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.991259 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.991361 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5zcs\" (UniqueName: \"kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.992183 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:26 crc kubenswrapper[5036]: I0110 17:10:26.992197 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:27 crc kubenswrapper[5036]: I0110 17:10:27.012158 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f5zcs\" (UniqueName: \"kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs\") pod \"certified-operators-txfjp\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:27 crc kubenswrapper[5036]: I0110 17:10:27.090375 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:27 crc kubenswrapper[5036]: I0110 17:10:27.605005 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:28 crc kubenswrapper[5036]: I0110 17:10:28.637384 5036 generic.go:334] "Generic (PLEG): container finished" podID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerID="5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a" exitCode=0 Jan 10 17:10:28 crc kubenswrapper[5036]: I0110 17:10:28.637451 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerDied","Data":"5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a"} Jan 10 17:10:28 crc kubenswrapper[5036]: I0110 17:10:28.637850 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerStarted","Data":"e1c5f41ae26d0165dcbdef266d0e40593fa05c6c4de3343fda74105293091d36"} Jan 10 17:10:28 crc kubenswrapper[5036]: I0110 17:10:28.640797 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 17:10:30 crc kubenswrapper[5036]: I0110 17:10:30.709718 5036 generic.go:334] "Generic (PLEG): container finished" podID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerID="1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b" exitCode=0 Jan 10 17:10:30 crc kubenswrapper[5036]: I0110 17:10:30.709788 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerDied","Data":"1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b"} Jan 10 17:10:31 crc kubenswrapper[5036]: I0110 17:10:31.725606 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerStarted","Data":"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a"} Jan 10 17:10:31 crc kubenswrapper[5036]: I0110 17:10:31.758337 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-txfjp" podStartSLOduration=3.265340353 podStartE2EDuration="5.758311494s" podCreationTimestamp="2026-01-10 17:10:26 +0000 UTC" firstStartedPulling="2026-01-10 17:10:28.640175301 +0000 UTC m=+2550.510410835" lastFinishedPulling="2026-01-10 17:10:31.133146472 +0000 UTC m=+2553.003381976" observedRunningTime="2026-01-10 17:10:31.753887047 +0000 UTC m=+2553.624122611" watchObservedRunningTime="2026-01-10 17:10:31.758311494 +0000 UTC m=+2553.628546998" Jan 10 17:10:36 crc kubenswrapper[5036]: I0110 17:10:36.508267 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:10:36 crc kubenswrapper[5036]: E0110 17:10:36.511059 5036 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:10:37 crc kubenswrapper[5036]: I0110 17:10:37.090605 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:37 crc kubenswrapper[5036]: I0110 17:10:37.090732 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:37 crc kubenswrapper[5036]: I0110 17:10:37.166132 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:37 crc kubenswrapper[5036]: I0110 17:10:37.860147 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:37 crc kubenswrapper[5036]: I0110 17:10:37.934919 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:39 crc kubenswrapper[5036]: I0110 17:10:39.806312 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-txfjp" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="registry-server" containerID="cri-o://1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a" gracePeriod=2 Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.352717 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.500012 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5zcs\" (UniqueName: \"kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs\") pod \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.500066 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content\") pod \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.500184 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities\") pod \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\" (UID: \"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83\") " Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.501933 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities" (OuterVolumeSpecName: "utilities") pod "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" (UID: "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.510987 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs" (OuterVolumeSpecName: "kube-api-access-f5zcs") pod "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" (UID: "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83"). InnerVolumeSpecName "kube-api-access-f5zcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.604612 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5zcs\" (UniqueName: \"kubernetes.io/projected/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-kube-api-access-f5zcs\") on node \"crc\" DevicePath \"\"" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.604923 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.689224 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" (UID: "b833ddbd-5207-4c6a-96ed-7ac63e0a0c83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.706475 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.817253 5036 generic.go:334] "Generic (PLEG): container finished" podID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerID="1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a" exitCode=0 Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.817318 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerDied","Data":"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a"} Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.817371 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txfjp" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.817408 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txfjp" event={"ID":"b833ddbd-5207-4c6a-96ed-7ac63e0a0c83","Type":"ContainerDied","Data":"e1c5f41ae26d0165dcbdef266d0e40593fa05c6c4de3343fda74105293091d36"} Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.817439 5036 scope.go:117] "RemoveContainer" containerID="1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.849021 5036 scope.go:117] "RemoveContainer" containerID="1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.878954 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.896491 5036 scope.go:117] "RemoveContainer" containerID="5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.902315 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-txfjp"] Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.922062 5036 scope.go:117] "RemoveContainer" containerID="1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a" Jan 10 17:10:40 crc kubenswrapper[5036]: E0110 17:10:40.922416 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a\": container with ID starting with 1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a not found: ID does not exist" containerID="1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.922448 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a"} err="failed to get container status \"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a\": rpc error: code = NotFound desc = could not find container \"1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a\": container with ID starting with 1fe7d8822d63e1b4f16edad55e59faaf497768e98dbde1f89ad4b6d05b90f96a not found: ID does not exist" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.922469 5036 scope.go:117] "RemoveContainer" containerID="1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b" Jan 10 17:10:40 crc kubenswrapper[5036]: E0110 17:10:40.922744 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b\": container with ID starting with 1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b not found: ID does not exist" containerID="1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.922789 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b"} err="failed to get container status \"1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b\": rpc error: code = NotFound desc = could not find 
container \"1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b\": container with ID starting with 1c70f7bbfd9c6a20c4008d86367ea19899f22bf8422fee836dd61085bf0fc63b not found: ID does not exist" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.922817 5036 scope.go:117] "RemoveContainer" containerID="5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a" Jan 10 17:10:40 crc kubenswrapper[5036]: E0110 17:10:40.923107 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a\": container with ID starting with 5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a not found: ID does not exist" containerID="5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a" Jan 10 17:10:40 crc kubenswrapper[5036]: I0110 17:10:40.923128 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a"} err="failed to get container status \"5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a\": rpc error: code = NotFound desc = could not find container \"5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a\": container with ID starting with 5fcb83e104d4bc4112d354f6a30698a52fa370284e981eea3906be9ebd0ee93a not found: ID does not exist" Jan 10 17:10:42 crc kubenswrapper[5036]: I0110 17:10:42.525753 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" path="/var/lib/kubelet/pods/b833ddbd-5207-4c6a-96ed-7ac63e0a0c83/volumes" Jan 10 17:10:50 crc kubenswrapper[5036]: I0110 17:10:50.508157 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:10:50 crc kubenswrapper[5036]: E0110 17:10:50.509188 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:11:04 crc kubenswrapper[5036]: I0110 17:11:04.508612 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:11:04 crc kubenswrapper[5036]: E0110 17:11:04.510560 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:11:17 crc kubenswrapper[5036]: I0110 17:11:17.508005 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:11:17 crc kubenswrapper[5036]: E0110 17:11:17.509011 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:11:28 crc kubenswrapper[5036]: I0110 17:11:28.512915 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:11:28 crc kubenswrapper[5036]: E0110 17:11:28.513629 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:11:42 crc kubenswrapper[5036]: I0110 17:11:42.509578 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:11:42 crc kubenswrapper[5036]: E0110 17:11:42.510595 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:11:54 crc kubenswrapper[5036]: I0110 17:11:54.508339 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:11:54 crc kubenswrapper[5036]: E0110 17:11:54.509322 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:12:06 crc kubenswrapper[5036]: I0110 17:12:06.508543 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:12:06 crc kubenswrapper[5036]: E0110 17:12:06.509274 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:12:19 crc kubenswrapper[5036]: I0110 17:12:19.508499 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:12:19 crc kubenswrapper[5036]: E0110 17:12:19.509576 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:12:30 crc kubenswrapper[5036]: I0110 17:12:30.509476 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:12:30 crc kubenswrapper[5036]: E0110 17:12:30.510927 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:12:43 crc kubenswrapper[5036]: I0110 17:12:43.508369 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:12:43 crc kubenswrapper[5036]: E0110 17:12:43.509591 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:12:46 crc kubenswrapper[5036]: I0110 17:12:46.220267 5036 generic.go:334] "Generic (PLEG): container finished" podID="b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" containerID="78396c084f2cd983c056438458c6658a615f8a975fd33a797b43740834f33094" exitCode=0 Jan 10 17:12:46 crc kubenswrapper[5036]: I0110 17:12:46.220376 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" event={"ID":"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b","Type":"ContainerDied","Data":"78396c084f2cd983c056438458c6658a615f8a975fd33a797b43740834f33094"} Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.805870 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.917527 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.917765 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.917856 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.917930 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrkv8\" (UniqueName: \"kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.917973 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.918017 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle\") pod \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\" (UID: \"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b\") " Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.927775 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.932851 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph" (OuterVolumeSpecName: "ceph") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.933643 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8" (OuterVolumeSpecName: "kube-api-access-rrkv8") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "kube-api-access-rrkv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.971406 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.975961 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory" (OuterVolumeSpecName: "inventory") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:12:47 crc kubenswrapper[5036]: I0110 17:12:47.977384 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" (UID: "b0c29b9c-0e82-4bbc-89af-fa26d3c4603b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020696 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020733 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrkv8\" (UniqueName: \"kubernetes.io/projected/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-kube-api-access-rrkv8\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020749 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020760 5036 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020771 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.020782 5036 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0c29b9c-0e82-4bbc-89af-fa26d3c4603b-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.247980 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" event={"ID":"b0c29b9c-0e82-4bbc-89af-fa26d3c4603b","Type":"ContainerDied","Data":"77ee35b6b6671efdec0d161d1653e4c73ba587ab44837618c466e67b0e1c1b0e"} Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.248031 5036 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="77ee35b6b6671efdec0d161d1653e4c73ba587ab44837618c466e67b0e1c1b0e" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.248082 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.369528 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl"] Jan 10 17:12:48 crc kubenswrapper[5036]: E0110 17:12:48.369978 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="extract-content" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.369998 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="extract-content" Jan 10 17:12:48 crc kubenswrapper[5036]: E0110 17:12:48.370032 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="registry-server" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.370041 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="registry-server" Jan 10 17:12:48 crc kubenswrapper[5036]: E0110 17:12:48.370069 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.370079 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 10 17:12:48 crc kubenswrapper[5036]: E0110 17:12:48.370102 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="extract-utilities" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.370110 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="extract-utilities" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.370337 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c29b9c-0e82-4bbc-89af-fa26d3c4603b" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.370355 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b833ddbd-5207-4c6a-96ed-7ac63e0a0c83" containerName="registry-server" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.371136 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.374103 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.374282 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.374625 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.375071 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.375855 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.376187 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.376405 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-thwrl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.376630 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.377021 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.393543 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl"] Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.530666 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wwk\" (UniqueName: \"kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.530781 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.530929 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.530987 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531033 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531108 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531178 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531436 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531577 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531622 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.531647 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633722 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633785 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633815 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633845 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633912 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wwk\" (UniqueName: \"kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.633971 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.634095 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.634132 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.634156 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.634212 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.634251 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.637316 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.637475 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.641670 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.641895 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.641944 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.645799 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.648270 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.649253 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.649437 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.650001 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.653238 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wwk\" (UniqueName: \"kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:48 crc kubenswrapper[5036]: I0110 17:12:48.700901 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:12:49 crc kubenswrapper[5036]: I0110 17:12:49.280156 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl"] Jan 10 17:12:50 crc kubenswrapper[5036]: I0110 17:12:50.271175 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" event={"ID":"b4da8068-8e5a-4624-b65f-05da63640d19","Type":"ContainerStarted","Data":"8d94edafb3070d8cbff75daa7bb654cb8dc219fba1f8a3eb94a024d894c59e59"} Jan 10 17:12:50 crc kubenswrapper[5036]: I0110 17:12:50.271528 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" event={"ID":"b4da8068-8e5a-4624-b65f-05da63640d19","Type":"ContainerStarted","Data":"dd25e6eed2185244c3266f80de4527e19b59033887b482420b7e91eff48e7765"} Jan 10 17:12:57 crc kubenswrapper[5036]: I0110 17:12:57.508132 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:12:58 crc kubenswrapper[5036]: I0110 17:12:58.342127 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3"} Jan 10 17:12:58 crc kubenswrapper[5036]: I0110 17:12:58.376151 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" podStartSLOduration=9.875670486 podStartE2EDuration="10.376126207s" podCreationTimestamp="2026-01-10 17:12:48 +0000 UTC" firstStartedPulling="2026-01-10 17:12:49.296002173 +0000 UTC m=+2691.166237707" lastFinishedPulling="2026-01-10 17:12:49.796457894 +0000 UTC m=+2691.666693428" observedRunningTime="2026-01-10 17:12:50.294242689 +0000 UTC m=+2692.164478193" watchObservedRunningTime="2026-01-10 17:12:58.376126207 +0000 UTC m=+2700.246361741" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.398413 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.403968 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.408745 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.533485 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.534179 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6mk\" (UniqueName: \"kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.534217 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.635883 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6mk\" (UniqueName: \"kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.635939 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.636718 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.637029 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.637191 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.661168 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ww6mk\" (UniqueName: \"kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk\") pod \"redhat-marketplace-xxh7m\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:38 crc kubenswrapper[5036]: I0110 17:13:38.735851 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:39 crc kubenswrapper[5036]: I0110 17:13:39.242395 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:39 crc kubenswrapper[5036]: I0110 17:13:39.802311 5036 generic.go:334] "Generic (PLEG): container finished" podID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerID="2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4" exitCode=0 Jan 10 17:13:39 crc kubenswrapper[5036]: I0110 17:13:39.802359 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerDied","Data":"2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4"} Jan 10 17:13:39 crc kubenswrapper[5036]: I0110 17:13:39.802387 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerStarted","Data":"3e697a664ea21ffb0bf97b42a42c2e415f9160386bf917547610931a6bc8ae16"} Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.580326 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.584513 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.596657 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.776756 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65cx7\" (UniqueName: \"kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.776975 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.777029 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.812347 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerStarted","Data":"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa"} Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.878891 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65cx7\" (UniqueName: \"kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.878979 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.879021 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.879576 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.879599 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.909750 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65cx7\" (UniqueName: \"kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7\") pod \"redhat-operators-7kmmm\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:40 crc kubenswrapper[5036]: I0110 17:13:40.930787 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.398298 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.822394 5036 generic.go:334] "Generic (PLEG): container finished" podID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerID="ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038" exitCode=0 Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.822477 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerDied","Data":"ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038"} Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.822517 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerStarted","Data":"4e20fbb44bd1ec3b2467e6b7617c815acef90de2365fe2110aaa06927cc03ecf"} Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.825209 5036 generic.go:334] "Generic (PLEG): container finished" podID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerID="366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa" exitCode=0 Jan 10 17:13:41 crc kubenswrapper[5036]: I0110 17:13:41.825248 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerDied","Data":"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa"} Jan 10 17:13:42 crc kubenswrapper[5036]: I0110 17:13:42.836089 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerStarted","Data":"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b"} Jan 10 17:13:42 crc kubenswrapper[5036]: I0110 17:13:42.863027 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xxh7m" podStartSLOduration=2.332633949 podStartE2EDuration="4.863006357s" podCreationTimestamp="2026-01-10 17:13:38 +0000 UTC" firstStartedPulling="2026-01-10 17:13:39.805541937 +0000 UTC m=+2741.675777441" lastFinishedPulling="2026-01-10 17:13:42.335914355 +0000 UTC m=+2744.206149849" observedRunningTime="2026-01-10 17:13:42.856428339 +0000 UTC m=+2744.726663853" watchObservedRunningTime="2026-01-10 17:13:42.863006357 +0000 UTC m=+2744.733241851" Jan 10 17:13:43 crc kubenswrapper[5036]: I0110 17:13:43.846124 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerStarted","Data":"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5"} Jan 10 17:13:45 crc kubenswrapper[5036]: I0110 17:13:45.876355 5036 generic.go:334] "Generic (PLEG): container finished" podID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerID="876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5" exitCode=0 Jan 10 17:13:45 crc kubenswrapper[5036]: I0110 17:13:45.876461 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerDied","Data":"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5"} Jan 10 17:13:47 crc kubenswrapper[5036]: I0110 17:13:47.900344 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerStarted","Data":"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7"} Jan 10 17:13:47 crc kubenswrapper[5036]: I0110 17:13:47.954963 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7kmmm" podStartSLOduration=2.617219358 podStartE2EDuration="7.954939174s" podCreationTimestamp="2026-01-10 17:13:40 +0000 UTC" firstStartedPulling="2026-01-10 17:13:41.824221323 +0000 UTC m=+2743.694456817" lastFinishedPulling="2026-01-10 17:13:47.161941099 +0000 UTC m=+2749.032176633" observedRunningTime="2026-01-10 17:13:47.938082623 +0000 UTC m=+2749.808318127" watchObservedRunningTime="2026-01-10 17:13:47.954939174 +0000 UTC m=+2749.825174678" Jan 10 17:13:48 crc kubenswrapper[5036]: I0110 17:13:48.736638 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:48 crc kubenswrapper[5036]: I0110 17:13:48.738134 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:48 crc kubenswrapper[5036]: I0110 17:13:48.798156 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:48 crc kubenswrapper[5036]: I0110 17:13:48.956071 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:49 crc kubenswrapper[5036]: I0110 17:13:49.772250 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:50 crc kubenswrapper[5036]: I0110 17:13:50.923088 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xxh7m" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="registry-server" containerID="cri-o://69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b" gracePeriod=2 Jan 10 17:13:50 crc kubenswrapper[5036]: I0110 17:13:50.931873 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:50 crc kubenswrapper[5036]: I0110 17:13:50.931904 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.365389 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.477381 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities\") pod \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.477435 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content\") pod \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.477576 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6mk\" (UniqueName: \"kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk\") pod \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\" (UID: \"50a2a5d5-91e9-4cdd-8864-8001311a73bd\") " Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.478156 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities" (OuterVolumeSpecName: "utilities") pod "50a2a5d5-91e9-4cdd-8864-8001311a73bd" (UID: "50a2a5d5-91e9-4cdd-8864-8001311a73bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.484121 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk" (OuterVolumeSpecName: "kube-api-access-ww6mk") pod "50a2a5d5-91e9-4cdd-8864-8001311a73bd" (UID: "50a2a5d5-91e9-4cdd-8864-8001311a73bd"). InnerVolumeSpecName "kube-api-access-ww6mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.506083 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50a2a5d5-91e9-4cdd-8864-8001311a73bd" (UID: "50a2a5d5-91e9-4cdd-8864-8001311a73bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.580088 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.580128 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a2a5d5-91e9-4cdd-8864-8001311a73bd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.580144 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6mk\" (UniqueName: \"kubernetes.io/projected/50a2a5d5-91e9-4cdd-8864-8001311a73bd-kube-api-access-ww6mk\") on node \"crc\" DevicePath \"\"" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.934428 5036 generic.go:334] "Generic (PLEG): container finished" podID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerID="69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b" exitCode=0 Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.934616 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerDied","Data":"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b"} Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.934861 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xxh7m" event={"ID":"50a2a5d5-91e9-4cdd-8864-8001311a73bd","Type":"ContainerDied","Data":"3e697a664ea21ffb0bf97b42a42c2e415f9160386bf917547610931a6bc8ae16"} Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.934896 5036 scope.go:117] "RemoveContainer" containerID="69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.934712 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xxh7m" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.957306 5036 scope.go:117] "RemoveContainer" containerID="366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.981922 5036 scope.go:117] "RemoveContainer" containerID="2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4" Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.991105 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:51 crc kubenswrapper[5036]: I0110 17:13:51.993503 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7kmmm" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="registry-server" probeResult="failure" output=< Jan 10 17:13:51 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 17:13:51 crc kubenswrapper[5036]: > Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.007058 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xxh7m"] Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.022821 5036 scope.go:117] "RemoveContainer" containerID="69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b" Jan 10 17:13:52 crc kubenswrapper[5036]: E0110 17:13:52.023230 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b\": container with ID starting with 69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b not found: ID does not exist" containerID="69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.023265 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b"} err="failed to get container status \"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b\": rpc error: code = NotFound desc = could not find container \"69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b\": container with ID starting with 69e71571599b54d94804cb296058fcbc860a6733d651057796288d432d460e8b not found: ID does not exist" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.023290 5036 scope.go:117] "RemoveContainer" containerID="366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa" Jan 10 17:13:52 crc kubenswrapper[5036]: E0110 17:13:52.023852 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa\": container with ID starting with 366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa not found: ID does not exist" containerID="366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.023899 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa"} err="failed to get container status \"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa\": rpc error: code = NotFound desc = could not find container \"366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa\": container 
with ID starting with 366057db15cfaef0243e9086dc25a91072b054727e53ca13f55c45b33b919bfa not found: ID does not exist" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.023934 5036 scope.go:117] "RemoveContainer" containerID="2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4" Jan 10 17:13:52 crc kubenswrapper[5036]: E0110 17:13:52.024443 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4\": container with ID starting with 2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4 not found: ID does not exist" containerID="2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.024515 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4"} err="failed to get container status \"2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4\": rpc error: code = NotFound desc = could not find container \"2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4\": container with ID starting with 2f7bc1839304dbe923712b38ef17b339d1268c25322dd931046f2552a77b1fd4 not found: ID does not exist" Jan 10 17:13:52 crc kubenswrapper[5036]: I0110 17:13:52.523553 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" path="/var/lib/kubelet/pods/50a2a5d5-91e9-4cdd-8864-8001311a73bd/volumes" Jan 10 17:14:01 crc kubenswrapper[5036]: I0110 17:14:01.005826 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:14:01 crc kubenswrapper[5036]: I0110 17:14:01.085617 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:14:01 crc kubenswrapper[5036]: I0110 17:14:01.244926 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.056994 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7kmmm" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="registry-server" containerID="cri-o://8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7" gracePeriod=2 Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.499766 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.620422 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content\") pod \"0a56bf21-da14-4f3b-af16-d8390bfe5384\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.620595 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65cx7\" (UniqueName: \"kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7\") pod \"0a56bf21-da14-4f3b-af16-d8390bfe5384\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.620639 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities\") pod \"0a56bf21-da14-4f3b-af16-d8390bfe5384\" (UID: \"0a56bf21-da14-4f3b-af16-d8390bfe5384\") " Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.621294 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities" (OuterVolumeSpecName: "utilities") pod "0a56bf21-da14-4f3b-af16-d8390bfe5384" (UID: "0a56bf21-da14-4f3b-af16-d8390bfe5384"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.626863 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7" (OuterVolumeSpecName: "kube-api-access-65cx7") pod "0a56bf21-da14-4f3b-af16-d8390bfe5384" (UID: "0a56bf21-da14-4f3b-af16-d8390bfe5384"). InnerVolumeSpecName "kube-api-access-65cx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.723166 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65cx7\" (UniqueName: \"kubernetes.io/projected/0a56bf21-da14-4f3b-af16-d8390bfe5384-kube-api-access-65cx7\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.723197 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.742204 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a56bf21-da14-4f3b-af16-d8390bfe5384" (UID: "0a56bf21-da14-4f3b-af16-d8390bfe5384"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:14:02 crc kubenswrapper[5036]: I0110 17:14:02.824750 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a56bf21-da14-4f3b-af16-d8390bfe5384-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.071368 5036 generic.go:334] "Generic (PLEG): container finished" podID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerID="8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7" exitCode=0 Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.071439 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerDied","Data":"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7"} Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.071457 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7kmmm" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.071495 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7kmmm" event={"ID":"0a56bf21-da14-4f3b-af16-d8390bfe5384","Type":"ContainerDied","Data":"4e20fbb44bd1ec3b2467e6b7617c815acef90de2365fe2110aaa06927cc03ecf"} Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.071533 5036 scope.go:117] "RemoveContainer" containerID="8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.097668 5036 scope.go:117] "RemoveContainer" containerID="876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.125090 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.132884 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7kmmm"] Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.138346 5036 scope.go:117] "RemoveContainer" containerID="ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.185537 5036 scope.go:117] "RemoveContainer" containerID="8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7" Jan 10 17:14:03 crc kubenswrapper[5036]: E0110 17:14:03.185987 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7\": container with ID starting with 8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7 not found: ID does not exist" containerID="8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.186036 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7"} err="failed to get container status \"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7\": rpc error: code = NotFound desc = could not find container \"8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7\": container with ID starting with 8c07308ac8098768be754b654d2377d9686b5eac418dbf7e8ffe479049c24ed7 not found: ID does not exist" Jan 10 17:14:03 crc 
kubenswrapper[5036]: I0110 17:14:03.186066 5036 scope.go:117] "RemoveContainer" containerID="876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5" Jan 10 17:14:03 crc kubenswrapper[5036]: E0110 17:14:03.186403 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5\": container with ID starting with 876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5 not found: ID does not exist" containerID="876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.186455 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5"} err="failed to get container status \"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5\": rpc error: code = NotFound desc = could not find container \"876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5\": container with ID starting with 876065ce7d4c91a44ea40b0e3f1e2b43bdb6bd71f6c29252a62e09c8c6437ae5 not found: ID does not exist" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.186497 5036 scope.go:117] "RemoveContainer" containerID="ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038" Jan 10 17:14:03 crc kubenswrapper[5036]: E0110 17:14:03.187096 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038\": container with ID starting with ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038 not found: ID does not exist" containerID="ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038" Jan 10 17:14:03 crc kubenswrapper[5036]: I0110 17:14:03.187122 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038"} err="failed to get container status \"ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038\": rpc error: code = NotFound desc = could not find container \"ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038\": container with ID starting with ad10c9cf13abedc26028cac763cb1d599cad0c7dd17fade003293df98e452038 not found: ID does not exist" Jan 10 17:14:04 crc kubenswrapper[5036]: I0110 17:14:04.519968 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" path="/var/lib/kubelet/pods/0a56bf21-da14-4f3b-af16-d8390bfe5384/volumes" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.325172 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.325969 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="extract-content" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.325982 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="extract-content" Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.325997 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="extract-content" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326003 5036 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="extract-content" Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.326010 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="extract-utilities" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326018 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="extract-utilities" Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.326030 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="extract-utilities" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326036 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="extract-utilities" Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.326044 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326051 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: E0110 17:14:26.326067 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326073 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326230 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a2a5d5-91e9-4cdd-8864-8001311a73bd" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.326242 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a56bf21-da14-4f3b-af16-d8390bfe5384" containerName="registry-server" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.327488 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.337914 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.500083 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.500147 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.500183 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndb5j\" (UniqueName: \"kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.602186 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.602252 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.602278 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndb5j\" (UniqueName: \"kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.603011 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.603229 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.628727 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ndb5j\" (UniqueName: \"kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j\") pod \"community-operators-2m797\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.667247 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:26 crc kubenswrapper[5036]: I0110 17:14:26.965107 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:27 crc kubenswrapper[5036]: I0110 17:14:27.324296 5036 generic.go:334] "Generic (PLEG): container finished" podID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerID="3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed" exitCode=0 Jan 10 17:14:27 crc kubenswrapper[5036]: I0110 17:14:27.324365 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerDied","Data":"3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed"} Jan 10 17:14:27 crc kubenswrapper[5036]: I0110 17:14:27.324581 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerStarted","Data":"6fe1c3304eb89d9f32def27f232053a3f72836f6a4dceac7f9bf91684f682888"} Jan 10 17:14:28 crc kubenswrapper[5036]: I0110 17:14:28.338825 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerStarted","Data":"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d"} Jan 10 17:14:29 crc kubenswrapper[5036]: I0110 17:14:29.349945 5036 generic.go:334] "Generic (PLEG): container finished" podID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerID="b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d" exitCode=0 Jan 10 17:14:29 crc kubenswrapper[5036]: I0110 17:14:29.349999 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerDied","Data":"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d"} Jan 10 17:14:30 crc kubenswrapper[5036]: I0110 17:14:30.364361 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerStarted","Data":"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5"} Jan 10 17:14:30 crc kubenswrapper[5036]: I0110 17:14:30.382236 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2m797" podStartSLOduration=1.752272505 podStartE2EDuration="4.382220837s" podCreationTimestamp="2026-01-10 17:14:26 +0000 UTC" firstStartedPulling="2026-01-10 17:14:27.326402484 +0000 UTC m=+2789.196637978" lastFinishedPulling="2026-01-10 17:14:29.956350816 +0000 UTC m=+2791.826586310" observedRunningTime="2026-01-10 17:14:30.378388418 +0000 UTC m=+2792.248623912" watchObservedRunningTime="2026-01-10 17:14:30.382220837 +0000 UTC m=+2792.252456321" Jan 10 17:14:36 crc kubenswrapper[5036]: I0110 17:14:36.667348 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:36 crc kubenswrapper[5036]: I0110 17:14:36.667892 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:36 crc kubenswrapper[5036]: I0110 17:14:36.729968 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:37 crc kubenswrapper[5036]: I0110 17:14:37.500342 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:37 crc kubenswrapper[5036]: I0110 17:14:37.548337 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:39 crc kubenswrapper[5036]: I0110 17:14:39.447310 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2m797" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="registry-server" containerID="cri-o://7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5" gracePeriod=2 Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.444009 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.457666 5036 generic.go:334] "Generic (PLEG): container finished" podID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerID="7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5" exitCode=0 Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.457721 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerDied","Data":"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5"} Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.457751 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2m797" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.457776 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2m797" event={"ID":"01ed7e25-ae1a-4355-8e49-eab4921439ab","Type":"ContainerDied","Data":"6fe1c3304eb89d9f32def27f232053a3f72836f6a4dceac7f9bf91684f682888"} Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.457804 5036 scope.go:117] "RemoveContainer" containerID="7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.480357 5036 scope.go:117] "RemoveContainer" containerID="b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.511908 5036 scope.go:117] "RemoveContainer" containerID="3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.564242 5036 scope.go:117] "RemoveContainer" containerID="7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5" Jan 10 17:14:40 crc kubenswrapper[5036]: E0110 17:14:40.564836 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5\": container with ID starting with 7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5 not found: ID does not exist" containerID="7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.564868 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5"} err="failed to get container status \"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5\": rpc error: code = NotFound desc = could not find container \"7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5\": container with ID starting with 7ade87aa9f047719455af47a5a0cf3cb15e6bc849ddd8a1c0b97a6c3b58a77e5 not found: ID does not exist" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.564889 5036 scope.go:117] "RemoveContainer" containerID="b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d" Jan 10 17:14:40 crc kubenswrapper[5036]: E0110 17:14:40.565482 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d\": container with ID starting with b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d not found: ID does not exist" containerID="b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.565544 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d"} err="failed to get container status \"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d\": rpc error: code = NotFound desc = could not find container \"b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d\": container with ID starting with b276d2ede1923b6e9cbfe348cf0f55351c271983f9042f22c5f169ecd9bf365d not found: ID does not exist" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.565584 5036 scope.go:117] "RemoveContainer" 
containerID="3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed" Jan 10 17:14:40 crc kubenswrapper[5036]: E0110 17:14:40.566137 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed\": container with ID starting with 3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed not found: ID does not exist" containerID="3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.566169 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed"} err="failed to get container status \"3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed\": rpc error: code = NotFound desc = could not find container \"3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed\": container with ID starting with 3f9f81143ebdd941bfbc89d98551f566423a8dda061f418f06a927613e90a8ed not found: ID does not exist" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.591119 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities\") pod \"01ed7e25-ae1a-4355-8e49-eab4921439ab\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.591361 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndb5j\" (UniqueName: \"kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j\") pod \"01ed7e25-ae1a-4355-8e49-eab4921439ab\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.591447 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content\") pod \"01ed7e25-ae1a-4355-8e49-eab4921439ab\" (UID: \"01ed7e25-ae1a-4355-8e49-eab4921439ab\") " Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.592770 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities" (OuterVolumeSpecName: "utilities") pod "01ed7e25-ae1a-4355-8e49-eab4921439ab" (UID: "01ed7e25-ae1a-4355-8e49-eab4921439ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.600638 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j" (OuterVolumeSpecName: "kube-api-access-ndb5j") pod "01ed7e25-ae1a-4355-8e49-eab4921439ab" (UID: "01ed7e25-ae1a-4355-8e49-eab4921439ab"). InnerVolumeSpecName "kube-api-access-ndb5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.664364 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01ed7e25-ae1a-4355-8e49-eab4921439ab" (UID: "01ed7e25-ae1a-4355-8e49-eab4921439ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.694017 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.694048 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndb5j\" (UniqueName: \"kubernetes.io/projected/01ed7e25-ae1a-4355-8e49-eab4921439ab-kube-api-access-ndb5j\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.694058 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ed7e25-ae1a-4355-8e49-eab4921439ab-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.789583 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:40 crc kubenswrapper[5036]: I0110 17:14:40.798656 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2m797"] Jan 10 17:14:42 crc kubenswrapper[5036]: I0110 17:14:42.524308 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" path="/var/lib/kubelet/pods/01ed7e25-ae1a-4355-8e49-eab4921439ab/volumes" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.168730 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n"] Jan 10 17:15:00 crc kubenswrapper[5036]: E0110 17:15:00.170189 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="extract-content" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.170212 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="extract-content" Jan 10 17:15:00 crc kubenswrapper[5036]: E0110 17:15:00.170243 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="registry-server" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.170254 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="registry-server" Jan 10 17:15:00 crc kubenswrapper[5036]: E0110 17:15:00.170290 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="extract-utilities" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.170301 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="extract-utilities" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.170613 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ed7e25-ae1a-4355-8e49-eab4921439ab" containerName="registry-server" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.171498 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.174755 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.175113 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.197825 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n"] Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.243665 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.243904 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.244013 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.346142 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.346645 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.346882 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.347177 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume\") pod 
\"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.361941 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.362949 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5\") pod \"collect-profiles-29467755-fkm7n\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.526334 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:00 crc kubenswrapper[5036]: I0110 17:15:00.980480 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n"] Jan 10 17:15:01 crc kubenswrapper[5036]: I0110 17:15:01.689246 5036 generic.go:334] "Generic (PLEG): container finished" podID="d0734fab-42f0-4005-a68b-82f939454d73" containerID="031d324afdd34a9b9690f61cee51a4327298b30f779d9980b4b4ddbabc2077af" exitCode=0 Jan 10 17:15:01 crc kubenswrapper[5036]: I0110 17:15:01.689502 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" event={"ID":"d0734fab-42f0-4005-a68b-82f939454d73","Type":"ContainerDied","Data":"031d324afdd34a9b9690f61cee51a4327298b30f779d9980b4b4ddbabc2077af"} Jan 10 17:15:01 crc kubenswrapper[5036]: I0110 17:15:01.689576 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" event={"ID":"d0734fab-42f0-4005-a68b-82f939454d73","Type":"ContainerStarted","Data":"c21bf0014ae1c1164610f06c68e842c50628c5dafa7e58224ca4106ff20b2c38"} Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.016417 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.098827 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume\") pod \"d0734fab-42f0-4005-a68b-82f939454d73\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.099122 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume\") pod \"d0734fab-42f0-4005-a68b-82f939454d73\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.099227 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5\") pod \"d0734fab-42f0-4005-a68b-82f939454d73\" (UID: \"d0734fab-42f0-4005-a68b-82f939454d73\") " Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.099845 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume" (OuterVolumeSpecName: "config-volume") pod "d0734fab-42f0-4005-a68b-82f939454d73" (UID: "d0734fab-42f0-4005-a68b-82f939454d73"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.106936 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d0734fab-42f0-4005-a68b-82f939454d73" (UID: "d0734fab-42f0-4005-a68b-82f939454d73"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.107068 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5" (OuterVolumeSpecName: "kube-api-access-pl2w5") pod "d0734fab-42f0-4005-a68b-82f939454d73" (UID: "d0734fab-42f0-4005-a68b-82f939454d73"). InnerVolumeSpecName "kube-api-access-pl2w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.201375 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0734fab-42f0-4005-a68b-82f939454d73-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.201587 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl2w5\" (UniqueName: \"kubernetes.io/projected/d0734fab-42f0-4005-a68b-82f939454d73-kube-api-access-pl2w5\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.201708 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d0734fab-42f0-4005-a68b-82f939454d73-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.709012 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" event={"ID":"d0734fab-42f0-4005-a68b-82f939454d73","Type":"ContainerDied","Data":"c21bf0014ae1c1164610f06c68e842c50628c5dafa7e58224ca4106ff20b2c38"} Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.709050 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21bf0014ae1c1164610f06c68e842c50628c5dafa7e58224ca4106ff20b2c38" Jan 10 17:15:03 crc kubenswrapper[5036]: I0110 17:15:03.709355 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467755-fkm7n" Jan 10 17:15:04 crc kubenswrapper[5036]: I0110 17:15:04.118570 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2"] Jan 10 17:15:04 crc kubenswrapper[5036]: I0110 17:15:04.132860 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467710-wr8z2"] Jan 10 17:15:04 crc kubenswrapper[5036]: I0110 17:15:04.533252 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa416ec-eaf9-430f-9ada-2b4dd73c76ca" path="/var/lib/kubelet/pods/ffa416ec-eaf9-430f-9ada-2b4dd73c76ca/volumes" Jan 10 17:15:23 crc kubenswrapper[5036]: I0110 17:15:23.693942 5036 scope.go:117] "RemoveContainer" containerID="9437bee23ebc26628c7c421d7b8e6a3d87fef287bffc247d34d41ac077c8d3e2" Jan 10 17:15:25 crc kubenswrapper[5036]: I0110 17:15:25.904984 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:15:25 crc kubenswrapper[5036]: I0110 17:15:25.905876 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:15:48 crc kubenswrapper[5036]: I0110 17:15:48.176181 5036 generic.go:334] "Generic (PLEG): container finished" podID="b4da8068-8e5a-4624-b65f-05da63640d19" containerID="8d94edafb3070d8cbff75daa7bb654cb8dc219fba1f8a3eb94a024d894c59e59" exitCode=0 Jan 10 17:15:48 crc kubenswrapper[5036]: I0110 17:15:48.176240 5036 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" event={"ID":"b4da8068-8e5a-4624-b65f-05da63640d19","Type":"ContainerDied","Data":"8d94edafb3070d8cbff75daa7bb654cb8dc219fba1f8a3eb94a024d894c59e59"} Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.655197 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715160 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715267 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715295 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715450 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715472 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715521 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9wwk\" (UniqueName: \"kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715549 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715581 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715615 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715639 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.715662 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0\") pod \"b4da8068-8e5a-4624-b65f-05da63640d19\" (UID: \"b4da8068-8e5a-4624-b65f-05da63640d19\") " Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.724208 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk" (OuterVolumeSpecName: "kube-api-access-q9wwk") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "kube-api-access-q9wwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.724864 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph" (OuterVolumeSpecName: "ceph") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.737893 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.748650 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.761178 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.764017 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory" (OuterVolumeSpecName: "inventory") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.769472 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.770331 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.772355 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.773205 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.775050 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b4da8068-8e5a-4624-b65f-05da63640d19" (UID: "b4da8068-8e5a-4624-b65f-05da63640d19"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817367 5036 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817419 5036 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817440 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9wwk\" (UniqueName: \"kubernetes.io/projected/b4da8068-8e5a-4624-b65f-05da63640d19-kube-api-access-q9wwk\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817454 5036 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817466 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817478 5036 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817490 5036 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-inventory\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817501 5036 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817512 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817523 5036 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/b4da8068-8e5a-4624-b65f-05da63640d19-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:49 crc kubenswrapper[5036]: I0110 17:15:49.817535 5036 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b4da8068-8e5a-4624-b65f-05da63640d19-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 10 17:15:50 crc kubenswrapper[5036]: I0110 17:15:50.201399 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" event={"ID":"b4da8068-8e5a-4624-b65f-05da63640d19","Type":"ContainerDied","Data":"dd25e6eed2185244c3266f80de4527e19b59033887b482420b7e91eff48e7765"} Jan 10 17:15:50 crc kubenswrapper[5036]: I0110 17:15:50.201452 5036 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd25e6eed2185244c3266f80de4527e19b59033887b482420b7e91eff48e7765" Jan 10 17:15:50 crc kubenswrapper[5036]: I0110 17:15:50.201618 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl" Jan 10 17:15:55 crc kubenswrapper[5036]: I0110 17:15:55.904424 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:15:55 crc kubenswrapper[5036]: I0110 17:15:55.905250 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.336159 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 10 17:16:04 crc kubenswrapper[5036]: E0110 17:16:04.337297 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4da8068-8e5a-4624-b65f-05da63640d19" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.337317 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4da8068-8e5a-4624-b65f-05da63640d19" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 10 17:16:04 crc kubenswrapper[5036]: E0110 17:16:04.337383 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0734fab-42f0-4005-a68b-82f939454d73" containerName="collect-profiles" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.337396 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0734fab-42f0-4005-a68b-82f939454d73" containerName="collect-profiles" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.337726 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0734fab-42f0-4005-a68b-82f939454d73" containerName="collect-profiles" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.337758 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4da8068-8e5a-4624-b65f-05da63640d19" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.339451 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.342033 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.359373 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.361916 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451575 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451636 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451660 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451693 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451793 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451812 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dvl\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-kube-api-access-t8dvl\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451839 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451859 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451876 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-dev\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451892 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451910 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-sys\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451925 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451952 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-run\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451966 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.451992 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.500643 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.502324 5036 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.508308 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.526804 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.553954 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.553998 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-dev\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554019 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554043 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-sys\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554061 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554094 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-run\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554111 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554139 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554192 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554214 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554236 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554257 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554288 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554309 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554325 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dvl\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-kube-api-access-t8dvl\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.554344 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555060 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555290 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 
17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555363 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-dev\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555392 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555417 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-sys\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555484 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555511 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-run\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.555533 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.556779 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.558947 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5e51ea81-c177-4dc1-a427-c3290a9e6010-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.561483 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.563501 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: 
\"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.566568 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.568173 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.572902 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e51ea81-c177-4dc1-a427-c3290a9e6010-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.587497 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dvl\" (UniqueName: \"kubernetes.io/projected/5e51ea81-c177-4dc1-a427-c3290a9e6010-kube-api-access-t8dvl\") pod \"cinder-volume-volume1-0\" (UID: \"5e51ea81-c177-4dc1-a427-c3290a9e6010\") " pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655631 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-scripts\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655698 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655727 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-lib-modules\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655754 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-sys\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655804 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655829 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655876 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655944 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q292f\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-kube-api-access-q292f\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655978 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-ceph\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.655994 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-dev\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.656016 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-run\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.656039 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.656106 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.656143 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-machine-id\") pod \"cinder-backup-0\" (UID: 
\"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.656176 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.663076 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.757750 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758097 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758142 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758172 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758196 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-scripts\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758211 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758232 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-lib-modules\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758250 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-sys\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758284 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758300 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758325 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758338 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758363 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q292f\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-kube-api-access-q292f\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758383 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-ceph\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758399 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-dev\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758419 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-run\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.758502 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-run\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.757899 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759252 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-nvme\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759318 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-lib-modules\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759357 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-sys\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759401 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759408 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759444 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759849 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.759882 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-dev\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.764091 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.764115 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data-custom\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.769951 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-ceph\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.771635 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-config-data\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.773609 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-scripts\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.779350 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q292f\" (UniqueName: \"kubernetes.io/projected/72df80ca-b881-4bc6-b6bc-816dccb6a4a6-kube-api-access-q292f\") pod \"cinder-backup-0\" (UID: \"72df80ca-b881-4bc6-b6bc-816dccb6a4a6\") " pod="openstack/cinder-backup-0" Jan 10 17:16:04 crc kubenswrapper[5036]: I0110 17:16:04.826331 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.046056 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-zfb2d"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.053102 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.059801 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zfb2d"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.083534 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.085440 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.090529 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.090781 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-f9tzx" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.090899 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.091046 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.122781 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171697 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171759 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85rq\" (UniqueName: \"kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171785 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171826 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171860 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mggmx\" (UniqueName: \"kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.171900 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.172192 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs\") pod 
\"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.176892 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.178402 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.185900 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.186564 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kspfm" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.192014 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.196854 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.215490 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.231377 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-97e9-account-create-update-c6fxd"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.233358 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.247639 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.248494 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.250150 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.272032 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-97e9-account-create-update-c6fxd"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274663 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274750 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x85rq\" (UniqueName: \"kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274779 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274872 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274923 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.274997 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd7f6\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275040 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275110 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275135 5036 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mggmx\" (UniqueName: \"kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275221 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275252 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.275325 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276212 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276349 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276431 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276618 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276689 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276718 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.276866 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.278261 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.286658 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.290669 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.303613 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mggmx\" (UniqueName: \"kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx\") pod \"horizon-78fc99bfc7-mctlg\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.320997 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85rq\" (UniqueName: \"kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq\") pod \"manila-db-create-zfb2d\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.341435 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.342921 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.345652 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.345903 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.359830 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.367716 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378598 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378632 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378654 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378673 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378758 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd7f6\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378794 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378819 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378846 5036 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9nrq\" (UniqueName: \"kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378865 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378888 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378913 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378928 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378956 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2bc\" (UniqueName: \"kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378978 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.378994 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.379012 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc 
kubenswrapper[5036]: I0110 17:16:05.379404 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.379712 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.379826 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.382080 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"5e51ea81-c177-4dc1-a427-c3290a9e6010","Type":"ContainerStarted","Data":"9d383754228c171b8a4af49fef764938181edef8ede47f943b2a1510d299437c"} Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.385497 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.385672 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.385705 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.388445 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.388880 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.400781 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd7f6\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.402671 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.406460 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.413986 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.444197 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.458884 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481125 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481168 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481196 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481222 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2bc\" (UniqueName: \"kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481244 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key\") pod \"horizon-77c6f556bf-gmpft\" (UID: 
\"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481314 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481349 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481376 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481396 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481422 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481439 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481464 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9nrq\" (UniqueName: \"kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481481 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481502 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csk99\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99\") pod \"glance-default-internal-api-0\" (UID: 
\"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481534 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481558 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.481983 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.482593 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.483331 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.484047 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.486778 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.497957 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll2bc\" (UniqueName: \"kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc\") pod \"manila-97e9-account-create-update-c6fxd\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.500527 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9nrq\" (UniqueName: \"kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq\") pod \"horizon-77c6f556bf-gmpft\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " 
pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.508906 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.563251 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.582960 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.583010 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.583035 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.583055 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.583165 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.586428 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csk99\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.586465 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.586501 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.586529 5036 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.588428 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.588869 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.589866 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.590351 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.591733 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.591956 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.591526 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.594628 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.615448 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csk99\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 
crc kubenswrapper[5036]: I0110 17:16:05.621483 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.651506 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.690715 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.933901 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zfb2d"] Jan 10 17:16:05 crc kubenswrapper[5036]: W0110 17:16:05.937349 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01a3d231_ccaa_462b_a57b_b56b4e0f2921.slice/crio-0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee WatchSource:0}: Error finding container 0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee: Status 404 returned error can't find the container with id 0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee Jan 10 17:16:05 crc kubenswrapper[5036]: I0110 17:16:05.994568 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.010641 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.247368 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.315198 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-97e9-account-create-update-c6fxd"] Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.393126 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerStarted","Data":"1b50597387f9b9041d612d5d50cdff3a2e840456665674b4b85ba9b608c75ab9"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.397478 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zfb2d" event={"ID":"01a3d231-ccaa-462b-a57b-b56b4e0f2921","Type":"ContainerStarted","Data":"5fd8730449515662f761080759c647c5bf24d32a8338fc12b78ae59e23c1234e"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.397529 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zfb2d" event={"ID":"01a3d231-ccaa-462b-a57b-b56b4e0f2921","Type":"ContainerStarted","Data":"0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.398969 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerStarted","Data":"66c1fae10f0f71a2f03b6a7963b9404b90bfeec08c79d9a82af2a2db43f3f5b2"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 
17:16:06.401436 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-97e9-account-create-update-c6fxd" event={"ID":"d460130e-a99b-46ab-b4d5-fa9528b70515","Type":"ContainerStarted","Data":"47a8630eb055580d7d688088fdf4f7e1603b4f2564dee7ecc4805a423a2e9ab9"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.402039 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.404357 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerStarted","Data":"f38777d399b1c60f584ff06ac0185d4d2bdb9fe419a838a494631ed4142ae71f"} Jan 10 17:16:06 crc kubenswrapper[5036]: I0110 17:16:06.409732 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"72df80ca-b881-4bc6-b6bc-816dccb6a4a6","Type":"ContainerStarted","Data":"cb05e8425dd3086262cea548bf9231fb18079afda91d36aa1ed97bbc341bf58c"} Jan 10 17:16:06 crc kubenswrapper[5036]: W0110 17:16:06.461511 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c19b29_dfcc_4ff3_ac70_6c499f72c1bd.slice/crio-04e4ab9705f33a6123a967b4c0e3e5397e67016c8d80192b9aa9b8df7816ff57 WatchSource:0}: Error finding container 04e4ab9705f33a6123a967b4c0e3e5397e67016c8d80192b9aa9b8df7816ff57: Status 404 returned error can't find the container with id 04e4ab9705f33a6123a967b4c0e3e5397e67016c8d80192b9aa9b8df7816ff57 Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.423403 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.448665 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.453804 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.463105 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.492243 5036 generic.go:334] "Generic (PLEG): container finished" podID="01a3d231-ccaa-462b-a57b-b56b4e0f2921" containerID="5fd8730449515662f761080759c647c5bf24d32a8338fc12b78ae59e23c1234e" exitCode=0 Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.492426 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zfb2d" event={"ID":"01a3d231-ccaa-462b-a57b-b56b4e0f2921","Type":"ContainerDied","Data":"5fd8730449515662f761080759c647c5bf24d32a8338fc12b78ae59e23c1234e"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.500026 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.509371 5036 generic.go:334] "Generic (PLEG): container finished" podID="d460130e-a99b-46ab-b4d5-fa9528b70515" containerID="dfda14caa23bfac8209039b597c8208d37117f50a8a51807d99ab3470254e9b6" exitCode=0 Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.509410 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-97e9-account-create-update-c6fxd" event={"ID":"d460130e-a99b-46ab-b4d5-fa9528b70515","Type":"ContainerDied","Data":"dfda14caa23bfac8209039b597c8208d37117f50a8a51807d99ab3470254e9b6"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.519540 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerStarted","Data":"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.519587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerStarted","Data":"04e4ab9705f33a6123a967b4c0e3e5397e67016c8d80192b9aa9b8df7816ff57"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.532669 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.546736 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.546791 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.546826 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqqj\" (UniqueName: \"kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc 
kubenswrapper[5036]: I0110 17:16:07.546849 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.546901 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.546983 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.547020 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.556135 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.573197 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.580167 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5bcc8455c4-njd4j"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.582368 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.594128 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerStarted","Data":"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.600097 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bcc8455c4-njd4j"] Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.604077 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"72df80ca-b881-4bc6-b6bc-816dccb6a4a6","Type":"ContainerStarted","Data":"498719b04df261720ddab420d1b9f356519c47ac631c45b3fc74cffd45b19d48"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.604120 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"72df80ca-b881-4bc6-b6bc-816dccb6a4a6","Type":"ContainerStarted","Data":"d59615dd0aff45809038697fa9ad541287098202a483065a545fbeb9bf52edca"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.621535 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"5e51ea81-c177-4dc1-a427-c3290a9e6010","Type":"ContainerStarted","Data":"01ce700c8fb15c615106b8f7801f10e58c4039c357a79eb3704a78dadb48f75d"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.621573 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"5e51ea81-c177-4dc1-a427-c3290a9e6010","Type":"ContainerStarted","Data":"f3973ca48897b148e1466ddfc7dc117b485db520f55d65192155c57ef63a4460"} Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649014 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwwp\" (UniqueName: \"kubernetes.io/projected/e92a2ceb-4619-4207-a2a3-b6c588674ab8-kube-api-access-crwwp\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649077 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649118 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649142 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-combined-ca-bundle\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649167 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-scripts\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649209 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-secret-key\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649270 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-tls-certs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649295 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649329 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqqj\" (UniqueName: \"kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649349 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649402 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92a2ceb-4619-4207-a2a3-b6c588674ab8-logs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649421 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.649488 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-config-data\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.650003 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.651631 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.652421 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.654432 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.665750 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.666201 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.667557 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.610541847 podStartE2EDuration="3.667540961s" podCreationTimestamp="2026-01-10 17:16:04 +0000 UTC" firstStartedPulling="2026-01-10 17:16:05.464921683 +0000 UTC m=+2887.335157177" lastFinishedPulling="2026-01-10 17:16:06.521920797 +0000 UTC m=+2888.392156291" observedRunningTime="2026-01-10 17:16:07.634327594 +0000 UTC m=+2889.504563088" watchObservedRunningTime="2026-01-10 17:16:07.667540961 +0000 UTC m=+2889.537776455" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.675890 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqqj\" (UniqueName: \"kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj\") pod \"horizon-79f74f6ffb-kzjrv\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.680201 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.845450535 podStartE2EDuration="3.680182782s" podCreationTimestamp="2026-01-10 17:16:04 +0000 UTC" firstStartedPulling="2026-01-10 17:16:05.367509171 +0000 UTC m=+2887.237744665" lastFinishedPulling="2026-01-10 17:16:06.202241418 +0000 UTC m=+2888.072476912" observedRunningTime="2026-01-10 17:16:07.663575148 +0000 UTC m=+2889.533810642" watchObservedRunningTime="2026-01-10 17:16:07.680182782 +0000 UTC m=+2889.550418276" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.753184 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-config-data\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.760990 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crwwp\" (UniqueName: \"kubernetes.io/projected/e92a2ceb-4619-4207-a2a3-b6c588674ab8-kube-api-access-crwwp\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.755459 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-config-data\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.771008 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-combined-ca-bundle\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.771124 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-scripts\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.771167 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-secret-key\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.771199 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-tls-certs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.771509 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92a2ceb-4619-4207-a2a3-b6c588674ab8-logs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 
17:16:07.773274 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e92a2ceb-4619-4207-a2a3-b6c588674ab8-logs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.774609 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e92a2ceb-4619-4207-a2a3-b6c588674ab8-scripts\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.780184 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-secret-key\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.780398 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-horizon-tls-certs\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.795519 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwwp\" (UniqueName: \"kubernetes.io/projected/e92a2ceb-4619-4207-a2a3-b6c588674ab8-kube-api-access-crwwp\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.796658 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.797015 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e92a2ceb-4619-4207-a2a3-b6c588674ab8-combined-ca-bundle\") pod \"horizon-5bcc8455c4-njd4j\" (UID: \"e92a2ceb-4619-4207-a2a3-b6c588674ab8\") " pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.865481 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:07 crc kubenswrapper[5036]: I0110 17:16:07.996222 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.080924 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x85rq\" (UniqueName: \"kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq\") pod \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.080981 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts\") pod \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\" (UID: \"01a3d231-ccaa-462b-a57b-b56b4e0f2921\") " Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.083154 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01a3d231-ccaa-462b-a57b-b56b4e0f2921" (UID: "01a3d231-ccaa-462b-a57b-b56b4e0f2921"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.089523 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq" (OuterVolumeSpecName: "kube-api-access-x85rq") pod "01a3d231-ccaa-462b-a57b-b56b4e0f2921" (UID: "01a3d231-ccaa-462b-a57b-b56b4e0f2921"). InnerVolumeSpecName "kube-api-access-x85rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.183771 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x85rq\" (UniqueName: \"kubernetes.io/projected/01a3d231-ccaa-462b-a57b-b56b4e0f2921-kube-api-access-x85rq\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.184079 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01a3d231-ccaa-462b-a57b-b56b4e0f2921-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.424593 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:16:08 crc kubenswrapper[5036]: W0110 17:16:08.453919 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d588d2_de3c_4aa8_9949_cd2cc17beac6.slice/crio-76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef WatchSource:0}: Error finding container 76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef: Status 404 returned error can't find the container with id 76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.648250 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerStarted","Data":"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c"} Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.648859 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-log" 
containerID="cri-o://64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" gracePeriod=30 Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.648936 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-httpd" containerID="cri-o://c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" gracePeriod=30 Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.655691 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerStarted","Data":"76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef"} Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.670855 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zfb2d" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.671820 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zfb2d" event={"ID":"01a3d231-ccaa-462b-a57b-b56b4e0f2921","Type":"ContainerDied","Data":"0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee"} Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.671869 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4e60f93cc7f07fd29d09f91500ef9bed7ce2545001bea3edb882662d5fa0ee" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.682767 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerStarted","Data":"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0"} Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.682844 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5bcc8455c4-njd4j"] Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.682993 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-log" containerID="cri-o://93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" gracePeriod=30 Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.683538 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-httpd" containerID="cri-o://e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" gracePeriod=30 Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.687394 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.687369134 podStartE2EDuration="3.687369134s" podCreationTimestamp="2026-01-10 17:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:08.667995721 +0000 UTC m=+2890.538231215" watchObservedRunningTime="2026-01-10 17:16:08.687369134 +0000 UTC m=+2890.557604628" Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.710968 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.710949367 podStartE2EDuration="3.710949367s" podCreationTimestamp="2026-01-10 17:16:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:08.706908482 +0000 UTC m=+2890.577143976" watchObservedRunningTime="2026-01-10 17:16:08.710949367 +0000 UTC m=+2890.581184861" Jan 10 17:16:08 crc kubenswrapper[5036]: W0110 17:16:08.720314 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode92a2ceb_4619_4207_a2a3_b6c588674ab8.slice/crio-30c54cf0032322407fcedf52c88abe2118402973d18b145d428efc2b55639529 WatchSource:0}: Error finding container 30c54cf0032322407fcedf52c88abe2118402973d18b145d428efc2b55639529: Status 404 returned error can't find the container with id 30c54cf0032322407fcedf52c88abe2118402973d18b145d428efc2b55639529 Jan 10 17:16:08 crc kubenswrapper[5036]: I0110 17:16:08.966257 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.107193 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll2bc\" (UniqueName: \"kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc\") pod \"d460130e-a99b-46ab-b4d5-fa9528b70515\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.107310 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts\") pod \"d460130e-a99b-46ab-b4d5-fa9528b70515\" (UID: \"d460130e-a99b-46ab-b4d5-fa9528b70515\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.109047 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d460130e-a99b-46ab-b4d5-fa9528b70515" (UID: "d460130e-a99b-46ab-b4d5-fa9528b70515"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.119381 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc" (OuterVolumeSpecName: "kube-api-access-ll2bc") pod "d460130e-a99b-46ab-b4d5-fa9528b70515" (UID: "d460130e-a99b-46ab-b4d5-fa9528b70515"). InnerVolumeSpecName "kube-api-access-ll2bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.210458 5036 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d460130e-a99b-46ab-b4d5-fa9528b70515-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.210492 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll2bc\" (UniqueName: \"kubernetes.io/projected/d460130e-a99b-46ab-b4d5-fa9528b70515-kube-api-access-ll2bc\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.318557 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.414668 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.414852 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.414894 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csk99\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.414936 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.414953 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415007 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415057 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415105 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415145 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data\") pod \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\" (UID: \"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415679 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs" (OuterVolumeSpecName: "logs") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.415887 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.422366 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts" (OuterVolumeSpecName: "scripts") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.422775 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.423020 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99" (OuterVolumeSpecName: "kube-api-access-csk99") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "kube-api-access-csk99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.429596 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph" (OuterVolumeSpecName: "ceph") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.470290 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.481008 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.484532 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data" (OuterVolumeSpecName: "config-data") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.484641 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" (UID: "49c19b29-dfcc-4ff3-ac70-6c499f72c1bd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517376 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517411 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517420 5036 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517430 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517454 5036 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517465 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517477 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csk99\" (UniqueName: \"kubernetes.io/projected/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-kube-api-access-csk99\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517485 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.517493 5036 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.538103 5036 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618629 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618755 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618804 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618862 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618883 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd7f6\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618899 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618931 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.618991 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.619033 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data\") pod \"146829fe-d847-4236-9015-fdaaea944887\" (UID: \"146829fe-d847-4236-9015-fdaaea944887\") " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.619530 5036 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.621388 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.621406 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs" (OuterVolumeSpecName: "logs") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.638292 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph" (OuterVolumeSpecName: "ceph") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.639829 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6" (OuterVolumeSpecName: "kube-api-access-hd7f6") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "kube-api-access-hd7f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.640853 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.646999 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts" (OuterVolumeSpecName: "scripts") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.664802 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.712897 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721088 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721117 5036 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721126 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721136 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd7f6\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-kube-api-access-hd7f6\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721147 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/146829fe-d847-4236-9015-fdaaea944887-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721155 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/146829fe-d847-4236-9015-fdaaea944887-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.721174 5036 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.725869 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data" (OuterVolumeSpecName: "config-data") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.727503 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bcc8455c4-njd4j" event={"ID":"e92a2ceb-4619-4207-a2a3-b6c588674ab8","Type":"ContainerStarted","Data":"30c54cf0032322407fcedf52c88abe2118402973d18b145d428efc2b55639529"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.729096 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-97e9-account-create-update-c6fxd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.729735 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-97e9-account-create-update-c6fxd" event={"ID":"d460130e-a99b-46ab-b4d5-fa9528b70515","Type":"ContainerDied","Data":"47a8630eb055580d7d688088fdf4f7e1603b4f2564dee7ecc4805a423a2e9ab9"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.729777 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47a8630eb055580d7d688088fdf4f7e1603b4f2564dee7ecc4805a423a2e9ab9" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731056 5036 generic.go:334] "Generic (PLEG): container finished" podID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerID="e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" exitCode=143 Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731077 5036 generic.go:334] "Generic (PLEG): container finished" podID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerID="93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" exitCode=143 Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731105 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerDied","Data":"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731123 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerDied","Data":"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731133 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"49c19b29-dfcc-4ff3-ac70-6c499f72c1bd","Type":"ContainerDied","Data":"04e4ab9705f33a6123a967b4c0e3e5397e67016c8d80192b9aa9b8df7816ff57"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731148 5036 scope.go:117] "RemoveContainer" containerID="e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.731266 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.740295 5036 generic.go:334] "Generic (PLEG): container finished" podID="146829fe-d847-4236-9015-fdaaea944887" containerID="c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" exitCode=0 Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.740324 5036 generic.go:334] "Generic (PLEG): container finished" podID="146829fe-d847-4236-9015-fdaaea944887" containerID="64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" exitCode=143 Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.741266 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.741367 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerDied","Data":"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.741415 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerDied","Data":"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.741428 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"146829fe-d847-4236-9015-fdaaea944887","Type":"ContainerDied","Data":"f38777d399b1c60f584ff06ac0185d4d2bdb9fe419a838a494631ed4142ae71f"} Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.751015 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "146829fe-d847-4236-9015-fdaaea944887" (UID: "146829fe-d847-4236-9015-fdaaea944887"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.759314 5036 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.823158 5036 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.823186 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.823199 5036 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/146829fe-d847-4236-9015-fdaaea944887-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.826586 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.848299 5036 scope.go:117] "RemoveContainer" containerID="93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.859073 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.882884 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.891902 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892303 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 
17:16:09.892319 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892333 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892340 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892353 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892360 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892372 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a3d231-ccaa-462b-a57b-b56b4e0f2921" containerName="mariadb-database-create" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892377 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a3d231-ccaa-462b-a57b-b56b4e0f2921" containerName="mariadb-database-create" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892387 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d460130e-a99b-46ab-b4d5-fa9528b70515" containerName="mariadb-account-create-update" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892392 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d460130e-a99b-46ab-b4d5-fa9528b70515" containerName="mariadb-account-create-update" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.892417 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892423 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892576 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892592 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a3d231-ccaa-462b-a57b-b56b4e0f2921" containerName="mariadb-database-create" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892604 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-log" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892612 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="146829fe-d847-4236-9015-fdaaea944887" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892620 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d460130e-a99b-46ab-b4d5-fa9528b70515" containerName="mariadb-account-create-update" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.892634 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" containerName="glance-httpd" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.893524 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.918210 5036 scope.go:117] "RemoveContainer" containerID="e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.918888 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.919652 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.926925 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0\": container with ID starting with e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0 not found: ID does not exist" containerID="e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.926971 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0"} err="failed to get container status \"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0\": rpc error: code = NotFound desc = could not find container \"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0\": container with ID starting with e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0 not found: ID does not exist" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.927010 5036 scope.go:117] "RemoveContainer" containerID="93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" Jan 10 17:16:09 crc kubenswrapper[5036]: E0110 17:16:09.927627 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96\": container with ID starting with 93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96 not found: ID does not exist" containerID="93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.927692 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96"} err="failed to get container status \"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96\": rpc error: code = NotFound desc = could not find container \"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96\": container with ID starting with 93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96 not found: ID does not exist" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.948664 5036 scope.go:117] "RemoveContainer" containerID="e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.955228 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0"} err="failed to get container status \"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0\": rpc error: code = NotFound desc = could not find container \"e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0\": 
container with ID starting with e827be3d9fe6d83b9268b07f4b381860a8d9075fbf1f5e731bab6a1e8eb6f3e0 not found: ID does not exist" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.955280 5036 scope.go:117] "RemoveContainer" containerID="93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.958535 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96"} err="failed to get container status \"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96\": rpc error: code = NotFound desc = could not find container \"93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96\": container with ID starting with 93e61743d52773d2c4ab92d10e57fe98f6ee9c169b9c2410ab6390ac36901f96 not found: ID does not exist" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.958603 5036 scope.go:117] "RemoveContainer" containerID="c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.966747 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:09 crc kubenswrapper[5036]: I0110 17:16:09.993418 5036 scope.go:117] "RemoveContainer" containerID="64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.021772 5036 scope.go:117] "RemoveContainer" containerID="c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" Jan 10 17:16:10 crc kubenswrapper[5036]: E0110 17:16:10.022406 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c\": container with ID starting with c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c not found: ID does not exist" containerID="c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.022436 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c"} err="failed to get container status \"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c\": rpc error: code = NotFound desc = could not find container \"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c\": container with ID starting with c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c not found: ID does not exist" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.022457 5036 scope.go:117] "RemoveContainer" containerID="64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" Jan 10 17:16:10 crc kubenswrapper[5036]: E0110 17:16:10.022955 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721\": container with ID starting with 64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721 not found: ID does not exist" containerID="64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.022976 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721"} err="failed to get 
container status \"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721\": rpc error: code = NotFound desc = could not find container \"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721\": container with ID starting with 64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721 not found: ID does not exist" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.022990 5036 scope.go:117] "RemoveContainer" containerID="c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.023323 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c"} err="failed to get container status \"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c\": rpc error: code = NotFound desc = could not find container \"c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c\": container with ID starting with c7120baa7230c852d25345292e4119c7d481e3a55fbe139650aea1e1424f390c not found: ID does not exist" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.023342 5036 scope.go:117] "RemoveContainer" containerID="64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.023663 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721"} err="failed to get container status \"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721\": rpc error: code = NotFound desc = could not find container \"64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721\": container with ID starting with 64eb659e8d606bf5f5eedb6c0fa725f1ff1a14ac19df9c349ad00a934fcfb721 not found: ID does not exist" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033307 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033359 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033385 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-logs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033428 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033468 5036 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfqb8\" (UniqueName: \"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-kube-api-access-nfqb8\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033491 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033513 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033540 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.033799 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.081539 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.090860 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.113894 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.121731 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.125451 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.125716 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135385 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135456 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfqb8\" (UniqueName: \"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-kube-api-access-nfqb8\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135487 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135518 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135556 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135634 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135668 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.135690 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 
17:16:10.135725 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-logs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.136184 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-logs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.136654 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46236d51-28af-48ad-8aff-2300b9d0155f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.142337 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.145720 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.148390 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.153591 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46236d51-28af-48ad-8aff-2300b9d0155f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.155278 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.156249 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.163137 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfqb8\" (UniqueName: 
\"kubernetes.io/projected/46236d51-28af-48ad-8aff-2300b9d0155f-kube-api-access-nfqb8\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.190064 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.208150 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"46236d51-28af-48ad-8aff-2300b9d0155f\") " pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237087 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-logs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237199 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-scripts\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237232 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237273 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-ceph\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237333 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj5vz\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-kube-api-access-mj5vz\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237364 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237390 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " 
pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237435 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.237464 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-config-data\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.240831 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.338932 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-logs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.338986 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-scripts\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339020 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339069 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-ceph\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339126 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj5vz\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-kube-api-access-mj5vz\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339156 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339183 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339222 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.339244 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-config-data\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.343579 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.343583 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.344281 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.344816 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-ceph\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.345911 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31000160-d620-481e-8b44-98f23e2e0679-logs\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.346469 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-config-data\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.351505 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.355875 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31000160-d620-481e-8b44-98f23e2e0679-scripts\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.361280 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj5vz\" (UniqueName: \"kubernetes.io/projected/31000160-d620-481e-8b44-98f23e2e0679-kube-api-access-mj5vz\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.410237 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"31000160-d620-481e-8b44-98f23e2e0679\") " pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.459203 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.533418 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146829fe-d847-4236-9015-fdaaea944887" path="/var/lib/kubelet/pods/146829fe-d847-4236-9015-fdaaea944887/volumes" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.534288 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c19b29-dfcc-4ff3-ac70-6c499f72c1bd" path="/var/lib/kubelet/pods/49c19b29-dfcc-4ff3-ac70-6c499f72c1bd/volumes" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.626937 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-lk5kh"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.628579 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.630366 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hj558" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.631806 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.650992 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lk5kh"] Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.747889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.747984 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xxb\" (UniqueName: \"kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.748029 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.748190 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.850084 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.850198 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.850234 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xxb\" (UniqueName: \"kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.850257 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data\") pod \"manila-db-sync-lk5kh\" (UID: 
\"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.857678 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.868537 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.870458 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.873178 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 10 17:16:10 crc kubenswrapper[5036]: W0110 17:16:10.874802 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46236d51_28af_48ad_8aff_2300b9d0155f.slice/crio-975058f30720e760430b5a0e83686edbed77ed4b092b1dad16bcb1daf6153419 WatchSource:0}: Error finding container 975058f30720e760430b5a0e83686edbed77ed4b092b1dad16bcb1daf6153419: Status 404 returned error can't find the container with id 975058f30720e760430b5a0e83686edbed77ed4b092b1dad16bcb1daf6153419 Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.875688 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xxb\" (UniqueName: \"kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb\") pod \"manila-db-sync-lk5kh\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:10 crc kubenswrapper[5036]: I0110 17:16:10.983102 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.029590 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 10 17:16:11 crc kubenswrapper[5036]: W0110 17:16:11.041294 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31000160_d620_481e_8b44_98f23e2e0679.slice/crio-90e496cd307eabdc5b9dc93431e249245179c812bdc5d3e05e6cc95fffd65742 WatchSource:0}: Error finding container 90e496cd307eabdc5b9dc93431e249245179c812bdc5d3e05e6cc95fffd65742: Status 404 returned error can't find the container with id 90e496cd307eabdc5b9dc93431e249245179c812bdc5d3e05e6cc95fffd65742 Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.455391 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lk5kh"] Jan 10 17:16:11 crc kubenswrapper[5036]: W0110 17:16:11.458190 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d964e6_ac20_4cac_ad16_6461bc88fac7.slice/crio-4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a WatchSource:0}: Error finding container 4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a: Status 404 returned error can't find the container with id 4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.789058 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lk5kh" event={"ID":"e9d964e6-ac20-4cac-ad16-6461bc88fac7","Type":"ContainerStarted","Data":"4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a"} Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.792190 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31000160-d620-481e-8b44-98f23e2e0679","Type":"ContainerStarted","Data":"90e496cd307eabdc5b9dc93431e249245179c812bdc5d3e05e6cc95fffd65742"} Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.794788 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46236d51-28af-48ad-8aff-2300b9d0155f","Type":"ContainerStarted","Data":"16bcd32d246e59b53d9068c9d117d8cec4f0889c583ef971c573f16e12c64bac"} Jan 10 17:16:11 crc kubenswrapper[5036]: I0110 17:16:11.794817 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46236d51-28af-48ad-8aff-2300b9d0155f","Type":"ContainerStarted","Data":"975058f30720e760430b5a0e83686edbed77ed4b092b1dad16bcb1daf6153419"} Jan 10 17:16:12 crc kubenswrapper[5036]: I0110 17:16:12.818893 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"46236d51-28af-48ad-8aff-2300b9d0155f","Type":"ContainerStarted","Data":"ec326ab8d95f711e2bdc256df40487936a01116f0838bc8248ba7b91e471bfc9"} Jan 10 17:16:12 crc kubenswrapper[5036]: I0110 17:16:12.823328 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"31000160-d620-481e-8b44-98f23e2e0679","Type":"ContainerStarted","Data":"047f4b7497d8d60bf1a11291e1abb871bc4506baf143dc6aeb3db2155c402802"} Jan 10 17:16:12 crc kubenswrapper[5036]: I0110 17:16:12.823371 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"31000160-d620-481e-8b44-98f23e2e0679","Type":"ContainerStarted","Data":"e827d98fc947e11d9f78e7d681be6c3d85aafd31f214df4da81dec028f99bce8"} Jan 10 17:16:12 crc kubenswrapper[5036]: I0110 17:16:12.845623 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.845606128 podStartE2EDuration="3.845606128s" podCreationTimestamp="2026-01-10 17:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:12.834310125 +0000 UTC m=+2894.704545619" watchObservedRunningTime="2026-01-10 17:16:12.845606128 +0000 UTC m=+2894.715841622" Jan 10 17:16:12 crc kubenswrapper[5036]: I0110 17:16:12.860317 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.860299527 podStartE2EDuration="2.860299527s" podCreationTimestamp="2026-01-10 17:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:12.857638682 +0000 UTC m=+2894.727874166" watchObservedRunningTime="2026-01-10 17:16:12.860299527 +0000 UTC m=+2894.730535011" Jan 10 17:16:14 crc kubenswrapper[5036]: I0110 17:16:14.906412 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 10 17:16:15 crc kubenswrapper[5036]: I0110 17:16:15.065167 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.241863 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.243200 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.341066 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.369067 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.460194 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.460267 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.501434 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.526484 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.900932 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerStarted","Data":"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4"} Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.904397 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerStarted","Data":"c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607"} Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.907647 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bcc8455c4-njd4j" event={"ID":"e92a2ceb-4619-4207-a2a3-b6c588674ab8","Type":"ContainerStarted","Data":"4644af0da866cd4c827d9b2e81caff0f4d59a00fb871f275d3dcb24006d5750e"} Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.910882 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerStarted","Data":"088b6f96a798183b597c6041d9cff8936c5b19f8ab1c3680df9108d8aca41ab0"} Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.911730 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.911869 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.911882 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 10 17:16:20 crc kubenswrapper[5036]: I0110 17:16:20.911892 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.922117 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerStarted","Data":"33f3578f36c2eac18c7decc55a04cf76ce07f0a11f1219e396288278723db621"} Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.928100 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5bcc8455c4-njd4j" event={"ID":"e92a2ceb-4619-4207-a2a3-b6c588674ab8","Type":"ContainerStarted","Data":"42c2fd3f9b8d809450091246cc22e667babcd1e27788cc15c0f6d7d1eab83396"} Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.930753 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerStarted","Data":"73db86f898d4868807c8cb5337217949e7654dcf3bbcb3521f1ffef7a31582d6"} Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.930919 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78fc99bfc7-mctlg" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon-log" containerID="cri-o://088b6f96a798183b597c6041d9cff8936c5b19f8ab1c3680df9108d8aca41ab0" gracePeriod=30 Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.931238 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78fc99bfc7-mctlg" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon" containerID="cri-o://73db86f898d4868807c8cb5337217949e7654dcf3bbcb3521f1ffef7a31582d6" gracePeriod=30 Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.935906 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lk5kh" event={"ID":"e9d964e6-ac20-4cac-ad16-6461bc88fac7","Type":"ContainerStarted","Data":"acec2eef33e20f5517bc0a09686c6e1d11091dc40b735e07d9bfa39c8e63a6d9"} Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.950439 5036 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77c6f556bf-gmpft" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon-log" containerID="cri-o://899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" gracePeriod=30 Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.951503 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerStarted","Data":"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49"} Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.952130 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77c6f556bf-gmpft" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon" containerID="cri-o://067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" gracePeriod=30 Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.980942 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79f74f6ffb-kzjrv" podStartSLOduration=2.997881797 podStartE2EDuration="14.980916448s" podCreationTimestamp="2026-01-10 17:16:07 +0000 UTC" firstStartedPulling="2026-01-10 17:16:08.463051168 +0000 UTC m=+2890.333286662" lastFinishedPulling="2026-01-10 17:16:20.446085819 +0000 UTC m=+2902.316321313" observedRunningTime="2026-01-10 17:16:21.953230478 +0000 UTC m=+2903.823465982" watchObservedRunningTime="2026-01-10 17:16:21.980916448 +0000 UTC m=+2903.851151942" Jan 10 17:16:21 crc kubenswrapper[5036]: I0110 17:16:21.981331 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78fc99bfc7-mctlg" podStartSLOduration=2.572857898 podStartE2EDuration="16.98132602s" podCreationTimestamp="2026-01-10 17:16:05 +0000 UTC" firstStartedPulling="2026-01-10 17:16:06.019718406 +0000 UTC m=+2887.889953900" lastFinishedPulling="2026-01-10 17:16:20.428186528 +0000 UTC m=+2902.298422022" observedRunningTime="2026-01-10 17:16:21.973154257 +0000 UTC m=+2903.843389751" watchObservedRunningTime="2026-01-10 17:16:21.98132602 +0000 UTC m=+2903.851561514" Jan 10 17:16:22 crc kubenswrapper[5036]: I0110 17:16:22.000245 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77c6f556bf-gmpft" podStartSLOduration=2.8203934349999997 podStartE2EDuration="17.000226279s" podCreationTimestamp="2026-01-10 17:16:05 +0000 UTC" firstStartedPulling="2026-01-10 17:16:06.238808812 +0000 UTC m=+2888.109044306" lastFinishedPulling="2026-01-10 17:16:20.418641656 +0000 UTC m=+2902.288877150" observedRunningTime="2026-01-10 17:16:21.991547902 +0000 UTC m=+2903.861783396" watchObservedRunningTime="2026-01-10 17:16:22.000226279 +0000 UTC m=+2903.870461773" Jan 10 17:16:22 crc kubenswrapper[5036]: I0110 17:16:22.032384 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lk5kh" podStartSLOduration=3.074311898 podStartE2EDuration="12.032362207s" podCreationTimestamp="2026-01-10 17:16:10 +0000 UTC" firstStartedPulling="2026-01-10 17:16:11.46142259 +0000 UTC m=+2893.331658084" lastFinishedPulling="2026-01-10 17:16:20.419472899 +0000 UTC m=+2902.289708393" observedRunningTime="2026-01-10 17:16:22.010154873 +0000 UTC m=+2903.880390367" watchObservedRunningTime="2026-01-10 17:16:22.032362207 +0000 UTC m=+2903.902597701" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.786354 5036 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.786535 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.786713 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.786806 5036 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.796550 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 10 17:16:23 crc kubenswrapper[5036]: I0110 17:16:23.869976 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5bcc8455c4-njd4j" podStartSLOduration=5.177315674 podStartE2EDuration="16.869951502s" podCreationTimestamp="2026-01-10 17:16:07 +0000 UTC" firstStartedPulling="2026-01-10 17:16:08.723364772 +0000 UTC m=+2890.593600266" lastFinishedPulling="2026-01-10 17:16:20.4160006 +0000 UTC m=+2902.286236094" observedRunningTime="2026-01-10 17:16:22.040306574 +0000 UTC m=+2903.910542068" watchObservedRunningTime="2026-01-10 17:16:23.869951502 +0000 UTC m=+2905.740187006" Jan 10 17:16:25 crc kubenswrapper[5036]: I0110 17:16:25.404040 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:25 crc kubenswrapper[5036]: I0110 17:16:25.588759 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:26 crc kubenswrapper[5036]: I0110 17:16:26.024047 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:16:26 crc kubenswrapper[5036]: I0110 17:16:26.024101 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:16:26 crc kubenswrapper[5036]: I0110 17:16:26.024141 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:16:26 crc kubenswrapper[5036]: I0110 17:16:26.042752 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:16:26 crc kubenswrapper[5036]: I0110 17:16:26.042833 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3" gracePeriod=600 Jan 10 17:16:27 crc 
kubenswrapper[5036]: I0110 17:16:27.201604 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3" exitCode=0 Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.202294 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3"} Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.202351 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4"} Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.202385 5036 scope.go:117] "RemoveContainer" containerID="d34c698ed639e3ac3fd01efd36ceb11234b6aabf6fe4cee4ab346e27585727af" Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.797621 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.797932 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.866015 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:27 crc kubenswrapper[5036]: I0110 17:16:27.866145 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:37 crc kubenswrapper[5036]: I0110 17:16:37.307296 5036 generic.go:334] "Generic (PLEG): container finished" podID="e9d964e6-ac20-4cac-ad16-6461bc88fac7" containerID="acec2eef33e20f5517bc0a09686c6e1d11091dc40b735e07d9bfa39c8e63a6d9" exitCode=0 Jan 10 17:16:37 crc kubenswrapper[5036]: I0110 17:16:37.307386 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lk5kh" event={"ID":"e9d964e6-ac20-4cac-ad16-6461bc88fac7","Type":"ContainerDied","Data":"acec2eef33e20f5517bc0a09686c6e1d11091dc40b735e07d9bfa39c8e63a6d9"} Jan 10 17:16:37 crc kubenswrapper[5036]: I0110 17:16:37.798376 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Jan 10 17:16:37 crc kubenswrapper[5036]: I0110 17:16:37.868502 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5bcc8455c4-njd4j" podUID="e92a2ceb-4619-4207-a2a3-b6c588674ab8" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.243:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.243:8443: connect: connection refused" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.748609 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.882394 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8xxb\" (UniqueName: \"kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb\") pod \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.882473 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle\") pod \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.882783 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data\") pod \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.883466 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data\") pod \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\" (UID: \"e9d964e6-ac20-4cac-ad16-6461bc88fac7\") " Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.909264 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb" (OuterVolumeSpecName: "kube-api-access-w8xxb") pod "e9d964e6-ac20-4cac-ad16-6461bc88fac7" (UID: "e9d964e6-ac20-4cac-ad16-6461bc88fac7"). InnerVolumeSpecName "kube-api-access-w8xxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.909363 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "e9d964e6-ac20-4cac-ad16-6461bc88fac7" (UID: "e9d964e6-ac20-4cac-ad16-6461bc88fac7"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.924804 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data" (OuterVolumeSpecName: "config-data") pod "e9d964e6-ac20-4cac-ad16-6461bc88fac7" (UID: "e9d964e6-ac20-4cac-ad16-6461bc88fac7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.927254 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9d964e6-ac20-4cac-ad16-6461bc88fac7" (UID: "e9d964e6-ac20-4cac-ad16-6461bc88fac7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.986303 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.986330 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.986354 5036 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/e9d964e6-ac20-4cac-ad16-6461bc88fac7-job-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:38 crc kubenswrapper[5036]: I0110 17:16:38.986364 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8xxb\" (UniqueName: \"kubernetes.io/projected/e9d964e6-ac20-4cac-ad16-6461bc88fac7-kube-api-access-w8xxb\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.324734 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lk5kh" event={"ID":"e9d964e6-ac20-4cac-ad16-6461bc88fac7","Type":"ContainerDied","Data":"4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a"} Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.324778 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4712d0c227e5d491a3cf3515c0c57e405667c0e6715e3732757cb0d813625f1a" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.324786 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lk5kh" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.774423 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:16:39 crc kubenswrapper[5036]: E0110 17:16:39.775104 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d964e6-ac20-4cac-ad16-6461bc88fac7" containerName="manila-db-sync" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.775117 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d964e6-ac20-4cac-ad16-6461bc88fac7" containerName="manila-db-sync" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.775897 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d964e6-ac20-4cac-ad16-6461bc88fac7" containerName="manila-db-sync" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.808893 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.813132 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.825983 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-hj558" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.826153 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.826309 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.826471 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.875230 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.876941 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.880572 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.890156 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.988416 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-dd8k9"] Jan 10 17:16:39 crc kubenswrapper[5036]: I0110 17:16:39.990381 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013527 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013603 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013621 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013642 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013670 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013702 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013750 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013773 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013793 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-config\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " 
pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013825 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7m9z\" (UniqueName: \"kubernetes.io/projected/7186e5b3-1cc5-422b-8151-4a873bf08a6a-kube-api-access-c7m9z\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013895 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013961 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.013982 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9tg\" (UniqueName: \"kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014001 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014018 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014049 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014067 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmh5l\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014083 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts\") pod \"manila-scheduler-0\" (UID: 
\"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014113 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.014135 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.024317 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-dd8k9"] Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.086313 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.087954 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.090497 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.114199 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115400 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115454 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9tg\" (UniqueName: \"kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115474 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115493 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115515 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " 
pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115532 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmh5l\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115548 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115568 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115586 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115610 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115639 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115653 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115698 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115718 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115733 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115762 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115779 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115796 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-config\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115818 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7m9z\" (UniqueName: \"kubernetes.io/projected/7186e5b3-1cc5-422b-8151-4a873bf08a6a-kube-api-access-c7m9z\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.115857 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.116701 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.117212 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.124952 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.125283 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id\") pod \"manila-share-share1-0\" (UID: 
\"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.125994 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.126126 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.130254 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.130414 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7186e5b3-1cc5-422b-8151-4a873bf08a6a-config\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.131614 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.131752 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.133484 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.136007 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.136021 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9tg\" (UniqueName: \"kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.139262 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.140662 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.144471 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.148454 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data\") pod \"manila-scheduler-0\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.149896 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.151288 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7m9z\" (UniqueName: \"kubernetes.io/projected/7186e5b3-1cc5-422b-8151-4a873bf08a6a-kube-api-access-c7m9z\") pod \"dnsmasq-dns-76b5fdb995-dd8k9\" (UID: \"7186e5b3-1cc5-422b-8151-4a873bf08a6a\") " pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.175414 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmh5l\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l\") pod \"manila-share-share1-0\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " pod="openstack/manila-share-share1-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218014 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218084 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v55xm\" (UniqueName: \"kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218112 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc 
kubenswrapper[5036]: I0110 17:16:40.218146 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218235 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.218266 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.227071 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.330580 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.333806 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.333873 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v55xm\" (UniqueName: \"kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.340726 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343106 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343220 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " 
pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343301 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343354 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343401 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343844 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.343852 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.344002 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.373295 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v55xm\" (UniqueName: \"kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.376465 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.384414 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data\") pod \"manila-api-0\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.408029 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:40 crc kubenswrapper[5036]: I0110 17:16:40.451168 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:16:41 crc kubenswrapper[5036]: I0110 17:16:41.018855 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:16:41 crc kubenswrapper[5036]: I0110 17:16:41.192401 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-dd8k9"] Jan 10 17:16:41 crc kubenswrapper[5036]: I0110 17:16:41.402046 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" event={"ID":"7186e5b3-1cc5-422b-8151-4a873bf08a6a","Type":"ContainerStarted","Data":"a6ead812863d81e2e4e5fe04669a282ba1cdf8de64b6214fc27776167cc45d1f"} Jan 10 17:16:41 crc kubenswrapper[5036]: I0110 17:16:41.409671 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerStarted","Data":"e9b535553a3805673a0e16f5b6003039d5d747d7ff60f5c476ea1e9d45dcb284"} Jan 10 17:16:41 crc kubenswrapper[5036]: I0110 17:16:41.428978 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.568259 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerStarted","Data":"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a"} Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.568816 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerStarted","Data":"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643"} Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.568829 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerStarted","Data":"a0f79093f2e900e453e941ab63fe22998d8fc8bcb7dbb6409fa6cf202347e0aa"} Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.568975 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.587958 5036 generic.go:334] "Generic (PLEG): container finished" podID="7186e5b3-1cc5-422b-8151-4a873bf08a6a" containerID="46f71683b8b1abff55e112605c647f597c0abbd9d703a6f8ddf485aa21e96276" exitCode=0 Jan 10 17:16:42 crc kubenswrapper[5036]: I0110 17:16:42.588033 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" event={"ID":"7186e5b3-1cc5-422b-8151-4a873bf08a6a","Type":"ContainerDied","Data":"46f71683b8b1abff55e112605c647f597c0abbd9d703a6f8ddf485aa21e96276"} Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.124844 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.611312 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" event={"ID":"7186e5b3-1cc5-422b-8151-4a873bf08a6a","Type":"ContainerStarted","Data":"7cb04e24f4a376da993b9fb21a3405ba2873dde1fc2b8023a690d1ac51339359"} Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.611530 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.613960 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerStarted","Data":"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156"} Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.616609 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerStarted","Data":"9e1578f3fb7f69deed21d32cf4f979316fb97e0282aa5aff52d4608d6d119aa6"} Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.618170 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerStarted","Data":"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196"} Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.618668 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.654747 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.65473322 podStartE2EDuration="3.65473322s" podCreationTimestamp="2026-01-10 17:16:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:43.65265418 +0000 UTC m=+2925.522889674" watchObservedRunningTime="2026-01-10 17:16:43.65473322 +0000 UTC m=+2925.524968714" Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.659635 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" podStartSLOduration=4.659623949 podStartE2EDuration="4.659623949s" podCreationTimestamp="2026-01-10 17:16:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:43.632183216 +0000 UTC m=+2925.502418710" watchObservedRunningTime="2026-01-10 17:16:43.659623949 +0000 UTC m=+2925.529859443" Jan 10 17:16:43 crc kubenswrapper[5036]: I0110 17:16:43.674036 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.923623171 podStartE2EDuration="4.67402144s" podCreationTimestamp="2026-01-10 17:16:39 +0000 UTC" firstStartedPulling="2026-01-10 17:16:41.045346416 +0000 UTC m=+2922.915581910" lastFinishedPulling="2026-01-10 17:16:41.795744685 +0000 UTC m=+2923.665980179" observedRunningTime="2026-01-10 17:16:43.669185012 +0000 UTC m=+2925.539420506" watchObservedRunningTime="2026-01-10 17:16:43.67402144 +0000 UTC m=+2925.544256934" Jan 10 17:16:44 crc kubenswrapper[5036]: I0110 17:16:44.628484 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api-log" containerID="cri-o://503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" gracePeriod=30 Jan 10 17:16:44 crc kubenswrapper[5036]: I0110 17:16:44.628964 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api" containerID="cri-o://c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" gracePeriod=30 Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.421197 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492181 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v55xm\" (UniqueName: \"kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492222 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492272 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492291 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492331 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492364 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492382 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom\") pod \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\" (UID: \"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64\") " Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492429 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492650 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.492872 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs" (OuterVolumeSpecName: "logs") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.500850 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts" (OuterVolumeSpecName: "scripts") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.500942 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.501049 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm" (OuterVolumeSpecName: "kube-api-access-v55xm") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "kube-api-access-v55xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.551810 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data" (OuterVolumeSpecName: "config-data") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.583169 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" (UID: "20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594036 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v55xm\" (UniqueName: \"kubernetes.io/projected/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-kube-api-access-v55xm\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594069 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594079 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594087 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594096 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.594105 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657523 5036 generic.go:334] "Generic (PLEG): container finished" podID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerID="c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" exitCode=0 Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657568 5036 generic.go:334] "Generic (PLEG): container finished" podID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerID="503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" exitCode=143 Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657585 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657592 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerDied","Data":"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196"} Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657720 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerDied","Data":"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643"} Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657736 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64","Type":"ContainerDied","Data":"a0f79093f2e900e453e941ab63fe22998d8fc8bcb7dbb6409fa6cf202347e0aa"} Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.657753 5036 scope.go:117] "RemoveContainer" containerID="c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.693896 5036 scope.go:117] "RemoveContainer" containerID="503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.695763 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.709650 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.719886 5036 scope.go:117] "RemoveContainer" containerID="c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.724779 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:45 crc kubenswrapper[5036]: E0110 17:16:45.725002 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196\": container with ID starting with c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196 not found: ID does not exist" containerID="c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725043 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196"} err="failed to get container status \"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196\": rpc error: code = NotFound desc = could not find container \"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196\": container with ID starting with c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196 not found: ID does not exist" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725069 5036 scope.go:117] "RemoveContainer" containerID="503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" Jan 10 17:16:45 crc kubenswrapper[5036]: E0110 17:16:45.725203 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725218 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api" Jan 10 
17:16:45 crc kubenswrapper[5036]: E0110 17:16:45.725229 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api-log" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725238 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api-log" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725418 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.725436 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" containerName="manila-api-log" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.726394 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: E0110 17:16:45.726549 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643\": container with ID starting with 503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643 not found: ID does not exist" containerID="503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.726588 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643"} err="failed to get container status \"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643\": rpc error: code = NotFound desc = could not find container \"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643\": container with ID starting with 503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643 not found: ID does not exist" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.726616 5036 scope.go:117] "RemoveContainer" containerID="c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.727127 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196"} err="failed to get container status \"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196\": rpc error: code = NotFound desc = could not find container \"c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196\": container with ID starting with c1bf2f8247286235a17f274ef8050a60e392d17dec7d6fc0c385d172464a2196 not found: ID does not exist" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.727152 5036 scope.go:117] "RemoveContainer" containerID="503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.728190 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.728359 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643"} err="failed to get container status \"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643\": rpc error: code = NotFound desc = could not find container 
\"503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643\": container with ID starting with 503264e9473b551c0ab44c823a1ec792d3fbe5c573a217ef618d204abfe12643 not found: ID does not exist" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.732846 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.733010 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.733762 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900450 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkm5\" (UniqueName: \"kubernetes.io/projected/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-kube-api-access-6dkm5\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900860 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-internal-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900924 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-etc-machine-id\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900943 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-public-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900961 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-logs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.900991 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.901008 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.901399 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data-custom\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:45 crc kubenswrapper[5036]: I0110 17:16:45.901449 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-scripts\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.141967 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-scripts\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142043 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkm5\" (UniqueName: \"kubernetes.io/projected/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-kube-api-access-6dkm5\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142070 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-internal-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142118 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-etc-machine-id\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142139 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-public-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142153 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-logs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142176 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142194 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.142227 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data-custom\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.143449 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-etc-machine-id\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.144194 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-logs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.150357 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data-custom\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.150953 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-public-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.152133 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-scripts\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.152954 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-internal-tls-certs\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.155342 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-config-data\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.155863 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.168193 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkm5\" (UniqueName: \"kubernetes.io/projected/32551bcd-e5f3-445c-b4d2-d4ac138a54ce-kube-api-access-6dkm5\") pod \"manila-api-0\" (UID: \"32551bcd-e5f3-445c-b4d2-d4ac138a54ce\") " pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.412318 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.537359 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64" path="/var/lib/kubelet/pods/20c57aad-c4c0-4f96-bb0a-a8ddbb6b8a64/volumes" Jan 10 17:16:46 crc kubenswrapper[5036]: I0110 17:16:46.986789 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.340073 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.340662 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-central-agent" containerID="cri-o://5489c0ae77ae17200383fd972a712757b379a5d38a6f74632ffa477d87e91865" gracePeriod=30 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.340797 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="proxy-httpd" containerID="cri-o://2f69a058e801b8c19d6ee86a897a2e0c0a251d3c018051982d1e811a6a348011" gracePeriod=30 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.340858 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="sg-core" containerID="cri-o://87f287e589de8e951628d2f6583bc39b1ce289181410c6c1d0d58d4cf4352c9c" gracePeriod=30 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.340897 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-notification-agent" containerID="cri-o://bdb6d471dc67978377d531b907db8ae43857a59da3fc030c32e9911c851c92df" gracePeriod=30 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.697927 5036 generic.go:334] "Generic (PLEG): container finished" podID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerID="2f69a058e801b8c19d6ee86a897a2e0c0a251d3c018051982d1e811a6a348011" exitCode=0 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.698299 5036 generic.go:334] "Generic (PLEG): container finished" podID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerID="87f287e589de8e951628d2f6583bc39b1ce289181410c6c1d0d58d4cf4352c9c" exitCode=2 Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.698005 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerDied","Data":"2f69a058e801b8c19d6ee86a897a2e0c0a251d3c018051982d1e811a6a348011"} Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.698361 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerDied","Data":"87f287e589de8e951628d2f6583bc39b1ce289181410c6c1d0d58d4cf4352c9c"} Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.700585 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"32551bcd-e5f3-445c-b4d2-d4ac138a54ce","Type":"ContainerStarted","Data":"f29540a4aa48b9a45bda674658ebf7553e32f51a65d6a840b4f66d2d8b8453d7"} Jan 10 17:16:47 crc kubenswrapper[5036]: I0110 17:16:47.700633 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"32551bcd-e5f3-445c-b4d2-d4ac138a54ce","Type":"ContainerStarted","Data":"cba632fb4fb1daa9701aeb13483bc5c51a2212602fc7b9a4ddaa564679198d47"} Jan 10 17:16:48 crc kubenswrapper[5036]: I0110 17:16:48.738272 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"32551bcd-e5f3-445c-b4d2-d4ac138a54ce","Type":"ContainerStarted","Data":"9ecc454f4024be3847bc9b5c6181eb4897590a1a6916ec047702c52b18309351"} Jan 10 17:16:48 crc kubenswrapper[5036]: I0110 17:16:48.738656 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 10 17:16:48 crc kubenswrapper[5036]: I0110 17:16:48.754587 5036 generic.go:334] "Generic (PLEG): container finished" podID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerID="5489c0ae77ae17200383fd972a712757b379a5d38a6f74632ffa477d87e91865" exitCode=0 Jan 10 17:16:48 crc kubenswrapper[5036]: I0110 17:16:48.754647 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerDied","Data":"5489c0ae77ae17200383fd972a712757b379a5d38a6f74632ffa477d87e91865"} Jan 10 17:16:48 crc kubenswrapper[5036]: I0110 17:16:48.759425 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.75940656 podStartE2EDuration="3.75940656s" podCreationTimestamp="2026-01-10 17:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:16:48.758289119 +0000 UTC m=+2930.628524633" watchObservedRunningTime="2026-01-10 17:16:48.75940656 +0000 UTC m=+2930.629642044" Jan 10 17:16:49 crc kubenswrapper[5036]: I0110 17:16:49.998791 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.023202 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.227803 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.333827 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-dd8k9" Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.489627 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.489955 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="dnsmasq-dns" containerID="cri-o://8f744aa4622ad38aa33f50eeef3b061d0b282c6c7b5418edefd6b2e8902fa5a5" gracePeriod=10 Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.774551 5036 generic.go:334] "Generic (PLEG): container finished" podID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerID="8f744aa4622ad38aa33f50eeef3b061d0b282c6c7b5418edefd6b2e8902fa5a5" exitCode=0 Jan 10 17:16:50 crc kubenswrapper[5036]: I0110 17:16:50.774768 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" event={"ID":"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0","Type":"ContainerDied","Data":"8f744aa4622ad38aa33f50eeef3b061d0b282c6c7b5418edefd6b2e8902fa5a5"} Jan 10 17:16:51 crc 
kubenswrapper[5036]: I0110 17:16:51.790208 5036 generic.go:334] "Generic (PLEG): container finished" podID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerID="bdb6d471dc67978377d531b907db8ae43857a59da3fc030c32e9911c851c92df" exitCode=0 Jan 10 17:16:51 crc kubenswrapper[5036]: I0110 17:16:51.790277 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerDied","Data":"bdb6d471dc67978377d531b907db8ae43857a59da3fc030c32e9911c851c92df"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.078903 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5bcc8455c4-njd4j" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.080195 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.131772 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.454129 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.539812 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640290 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640392 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh527\" (UniqueName: \"kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640430 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640465 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640526 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640576 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: 
I0110 17:16:52.640609 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640624 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640638 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640667 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640734 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640763 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640802 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb\") pod \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\" (UID: \"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.640824 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxh7l\" (UniqueName: \"kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l\") pod \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\" (UID: \"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.645261 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.646395 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.653286 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l" (OuterVolumeSpecName: "kube-api-access-cxh7l") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "kube-api-access-cxh7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.654036 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527" (OuterVolumeSpecName: "kube-api-access-qh527") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "kube-api-access-qh527". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.654144 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts" (OuterVolumeSpecName: "scripts") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.742712 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.742745 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxh7l\" (UniqueName: \"kubernetes.io/projected/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-kube-api-access-cxh7l\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.742756 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh527\" (UniqueName: \"kubernetes.io/projected/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-kube-api-access-qh527\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.742764 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.742773 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.766143 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.781526 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.783261 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.783532 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config" (OuterVolumeSpecName: "config") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.794874 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812289 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812360 5036 generic.go:334] "Generic (PLEG): container finished" podID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerID="73db86f898d4868807c8cb5337217949e7654dcf3bbcb3521f1ffef7a31582d6" exitCode=137 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812392 5036 generic.go:334] "Generic (PLEG): container finished" podID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerID="088b6f96a798183b597c6041d9cff8936c5b19f8ab1c3680df9108d8aca41ab0" exitCode=137 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812465 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerDied","Data":"73db86f898d4868807c8cb5337217949e7654dcf3bbcb3521f1ffef7a31582d6"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812494 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerDied","Data":"088b6f96a798183b597c6041d9cff8936c5b19f8ab1c3680df9108d8aca41ab0"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812505 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78fc99bfc7-mctlg" event={"ID":"d4911ee6-6ecd-40da-a12b-f6a79cdaa201","Type":"ContainerDied","Data":"66c1fae10f0f71a2f03b6a7963b9404b90bfeec08c79d9a82af2a2db43f3f5b2"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.812515 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66c1fae10f0f71a2f03b6a7963b9404b90bfeec08c79d9a82af2a2db43f3f5b2" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.813016 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" (UID: "a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.816844 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.830941 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.831106 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.831133 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-pg8hr" event={"ID":"a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0","Type":"ContainerDied","Data":"87561d71935330ec0435fa37ce00337a476051cde22a0b9c143bc9a8e880f532"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.831168 5036 scope.go:117] "RemoveContainer" containerID="8f744aa4622ad38aa33f50eeef3b061d0b282c6c7b5418edefd6b2e8902fa5a5" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838106 5036 generic.go:334] "Generic (PLEG): container finished" podID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerID="067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" exitCode=137 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838137 5036 generic.go:334] "Generic (PLEG): container finished" podID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerID="899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" exitCode=137 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838184 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerDied","Data":"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838216 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerDied","Data":"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838226 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77c6f556bf-gmpft" event={"ID":"dd7a2aaf-b591-4388-aaf0-f94c930032b5","Type":"ContainerDied","Data":"1b50597387f9b9041d612d5d50cdff3a2e840456665674b4b85ba9b608c75ab9"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.838305 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77c6f556bf-gmpft" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.839284 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.843493 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key\") pod \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.843538 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs\") pod \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.843735 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts\") pod \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.844280 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9nrq\" (UniqueName: \"kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq\") pod \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.844326 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data\") pod \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\" (UID: \"dd7a2aaf-b591-4388-aaf0-f94c930032b5\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.844493 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs" (OuterVolumeSpecName: "logs") pod "dd7a2aaf-b591-4388-aaf0-f94c930032b5" (UID: "dd7a2aaf-b591-4388-aaf0-f94c930032b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.844981 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon-log" containerID="cri-o://c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607" gracePeriod=30 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.845317 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.845484 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7","Type":"ContainerDied","Data":"116920f4f6ce959c10e8564ac2cbd93c94f828e95233e6c67247a14ec0a54b3a"} Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.845556 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" containerID="cri-o://33f3578f36c2eac18c7decc55a04cf76ce07f0a11f1219e396288278723db621" gracePeriod=30 Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846336 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846351 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846361 5036 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846371 5036 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846379 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd7a2aaf-b591-4388-aaf0-f94c930032b5-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846388 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846399 5036 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846408 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.846417 5036 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0-config\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.851945 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd7a2aaf-b591-4388-aaf0-f94c930032b5" (UID: "dd7a2aaf-b591-4388-aaf0-f94c930032b5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.860793 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq" (OuterVolumeSpecName: "kube-api-access-s9nrq") pod "dd7a2aaf-b591-4388-aaf0-f94c930032b5" (UID: "dd7a2aaf-b591-4388-aaf0-f94c930032b5"). InnerVolumeSpecName "kube-api-access-s9nrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.863600 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data" (OuterVolumeSpecName: "config-data") pod "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" (UID: "2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.880878 5036 scope.go:117] "RemoveContainer" containerID="556c96ba7bb7dff6a497d182daf22a4d471d73a458921b83ae68931133573551" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.911445 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts" (OuterVolumeSpecName: "scripts") pod "dd7a2aaf-b591-4388-aaf0-f94c930032b5" (UID: "dd7a2aaf-b591-4388-aaf0-f94c930032b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.913603 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.919907 5036 scope.go:117] "RemoveContainer" containerID="067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.921710 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-pg8hr"] Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.923441 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data" (OuterVolumeSpecName: "config-data") pod "dd7a2aaf-b591-4388-aaf0-f94c930032b5" (UID: "dd7a2aaf-b591-4388-aaf0-f94c930032b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.947429 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data\") pod \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.947528 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts\") pod \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.947555 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mggmx\" (UniqueName: \"kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx\") pod \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.947711 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key\") pod \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.947760 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs\") pod \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\" (UID: \"d4911ee6-6ecd-40da-a12b-f6a79cdaa201\") " Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948219 5036 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd7a2aaf-b591-4388-aaf0-f94c930032b5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948237 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948246 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948256 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9nrq\" (UniqueName: \"kubernetes.io/projected/dd7a2aaf-b591-4388-aaf0-f94c930032b5-kube-api-access-s9nrq\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948268 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd7a2aaf-b591-4388-aaf0-f94c930032b5-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.948584 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs" (OuterVolumeSpecName: "logs") pod "d4911ee6-6ecd-40da-a12b-f6a79cdaa201" (UID: "d4911ee6-6ecd-40da-a12b-f6a79cdaa201"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.951850 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d4911ee6-6ecd-40da-a12b-f6a79cdaa201" (UID: "d4911ee6-6ecd-40da-a12b-f6a79cdaa201"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.955893 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx" (OuterVolumeSpecName: "kube-api-access-mggmx") pod "d4911ee6-6ecd-40da-a12b-f6a79cdaa201" (UID: "d4911ee6-6ecd-40da-a12b-f6a79cdaa201"). InnerVolumeSpecName "kube-api-access-mggmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.973582 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts" (OuterVolumeSpecName: "scripts") pod "d4911ee6-6ecd-40da-a12b-f6a79cdaa201" (UID: "d4911ee6-6ecd-40da-a12b-f6a79cdaa201"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:52 crc kubenswrapper[5036]: I0110 17:16:52.979410 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data" (OuterVolumeSpecName: "config-data") pod "d4911ee6-6ecd-40da-a12b-f6a79cdaa201" (UID: "d4911ee6-6ecd-40da-a12b-f6a79cdaa201"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.054043 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.054081 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.054094 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mggmx\" (UniqueName: \"kubernetes.io/projected/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-kube-api-access-mggmx\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.054106 5036 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.054117 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4911ee6-6ecd-40da-a12b-f6a79cdaa201-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.126796 5036 scope.go:117] "RemoveContainer" containerID="899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.288275 5036 scope.go:117] "RemoveContainer" containerID="067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 
17:16:53.290897 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.293943 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49\": container with ID starting with 067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49 not found: ID does not exist" containerID="067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.294000 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49"} err="failed to get container status \"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49\": rpc error: code = NotFound desc = could not find container \"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49\": container with ID starting with 067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49 not found: ID does not exist" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.294024 5036 scope.go:117] "RemoveContainer" containerID="899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.294474 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4\": container with ID starting with 899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4 not found: ID does not exist" containerID="899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.294530 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4"} err="failed to get container status \"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4\": rpc error: code = NotFound desc = could not find container \"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4\": container with ID starting with 899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4 not found: ID does not exist" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.294562 5036 scope.go:117] "RemoveContainer" containerID="067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.295614 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49"} err="failed to get container status \"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49\": rpc error: code = NotFound desc = could not find container \"067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49\": container with ID starting with 067a0e7f5115e9db1753408dad5f266e433a2695546cabe6cd61427335199b49 not found: ID does not exist" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.295637 5036 scope.go:117] "RemoveContainer" containerID="899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.295905 5036 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4"} err="failed to get container status \"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4\": rpc error: code = NotFound desc = could not find container \"899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4\": container with ID starting with 899b15ac73aa82209ed544a962d2d1353aa937a6496dffec4bef0aaa7ac9dab4 not found: ID does not exist" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.295943 5036 scope.go:117] "RemoveContainer" containerID="2f69a058e801b8c19d6ee86a897a2e0c0a251d3c018051982d1e811a6a348011" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.305951 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77c6f556bf-gmpft"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.320710 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.331010 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.336606 5036 scope.go:117] "RemoveContainer" containerID="87f287e589de8e951628d2f6583bc39b1ce289181410c6c1d0d58d4cf4352c9c" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339275 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339845 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-central-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339864 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-central-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339879 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="init" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339888 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="init" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339899 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339907 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339921 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339927 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339948 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-notification-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339953 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-notification-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339964 5036 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="dnsmasq-dns" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339970 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="dnsmasq-dns" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339978 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="proxy-httpd" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.339983 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="proxy-httpd" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.339995 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340001 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.340010 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340015 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: E0110 17:16:53.340027 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="sg-core" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340032 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="sg-core" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340200 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="sg-core" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340212 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340221 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-notification-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340234 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" containerName="dnsmasq-dns" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340245 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340257 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="proxy-httpd" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340264 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" containerName="horizon" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340275 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" containerName="horizon-log" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.340287 5036 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" containerName="ceilometer-central-agent" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.341920 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.346307 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.350623 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.351015 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.351149 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.384221 5036 scope.go:117] "RemoveContainer" containerID="bdb6d471dc67978377d531b907db8ae43857a59da3fc030c32e9911c851c92df" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.410203 5036 scope.go:117] "RemoveContainer" containerID="5489c0ae77ae17200383fd972a712757b379a5d38a6f74632ffa477d87e91865" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463201 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463393 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463588 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463672 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gjp\" (UniqueName: \"kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463848 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.463939 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 
17:16:53.463979 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.464017 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.565796 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gjp\" (UniqueName: \"kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.565878 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.565921 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.565948 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.565971 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.566029 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.566057 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.566092 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " 
pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.567640 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.567746 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.571093 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.579607 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.580550 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.584132 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.587415 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.602014 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gjp\" (UniqueName: \"kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp\") pod \"ceilometer-0\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.635161 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.635910 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.858898 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerStarted","Data":"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed"} Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.859351 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerStarted","Data":"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0"} Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.863720 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78fc99bfc7-mctlg" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.887613 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=5.3078733830000004 podStartE2EDuration="14.887598783s" podCreationTimestamp="2026-01-10 17:16:39 +0000 UTC" firstStartedPulling="2026-01-10 17:16:42.608806133 +0000 UTC m=+2924.479041617" lastFinishedPulling="2026-01-10 17:16:52.188531523 +0000 UTC m=+2934.058767017" observedRunningTime="2026-01-10 17:16:53.882779965 +0000 UTC m=+2935.753015469" watchObservedRunningTime="2026-01-10 17:16:53.887598783 +0000 UTC m=+2935.757834277" Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.920313 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:53 crc kubenswrapper[5036]: I0110 17:16:53.929566 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-78fc99bfc7-mctlg"] Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.150459 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:16:54 crc kubenswrapper[5036]: W0110 17:16:54.151815 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebf651c_601b_4bdc_ad24_bbbd48574ea8.slice/crio-4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a WatchSource:0}: Error finding container 4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a: Status 404 returned error can't find the container with id 4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.536577 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7" path="/var/lib/kubelet/pods/2b68cc48-f3ba-47bd-8cee-4e8c0a3798e7/volumes" Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.537497 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0" path="/var/lib/kubelet/pods/a6ce8785-ce6a-4cf4-a5d8-b5f84da029d0/volumes" Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.538723 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4911ee6-6ecd-40da-a12b-f6a79cdaa201" path="/var/lib/kubelet/pods/d4911ee6-6ecd-40da-a12b-f6a79cdaa201/volumes" Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.539420 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7a2aaf-b591-4388-aaf0-f94c930032b5" path="/var/lib/kubelet/pods/dd7a2aaf-b591-4388-aaf0-f94c930032b5/volumes" Jan 10 17:16:54 crc kubenswrapper[5036]: I0110 17:16:54.901763 
5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerStarted","Data":"4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a"} Jan 10 17:16:55 crc kubenswrapper[5036]: I0110 17:16:55.913558 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerStarted","Data":"fb3b2de3cfe17c5bca16987ad741e60ba91bd4182aacd2d95cadc390fa5a4af2"} Jan 10 17:16:56 crc kubenswrapper[5036]: I0110 17:16:56.925155 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerStarted","Data":"9d2294717b146cbd2b8b0ab6374b632635aea4b71294b2437e9eed6a6553654d"} Jan 10 17:16:56 crc kubenswrapper[5036]: I0110 17:16:56.925706 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerStarted","Data":"16f9da10d39696b5227fcb0dd892cf0faf4ab27401021cf1d467902619b84878"} Jan 10 17:16:56 crc kubenswrapper[5036]: I0110 17:16:56.927331 5036 generic.go:334] "Generic (PLEG): container finished" podID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerID="33f3578f36c2eac18c7decc55a04cf76ce07f0a11f1219e396288278723db621" exitCode=0 Jan 10 17:16:56 crc kubenswrapper[5036]: I0110 17:16:56.927371 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerDied","Data":"33f3578f36c2eac18c7decc55a04cf76ce07f0a11f1219e396288278723db621"} Jan 10 17:16:57 crc kubenswrapper[5036]: I0110 17:16:57.798135 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.959176 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerStarted","Data":"ac3c0ab9e2ed742f6b803953f2724eec344458a41fbbabf735a942b705f67e74"} Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.959510 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-central-agent" containerID="cri-o://fb3b2de3cfe17c5bca16987ad741e60ba91bd4182aacd2d95cadc390fa5a4af2" gracePeriod=30 Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.959889 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.959900 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="proxy-httpd" containerID="cri-o://ac3c0ab9e2ed742f6b803953f2724eec344458a41fbbabf735a942b705f67e74" gracePeriod=30 Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.959966 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="sg-core" 
containerID="cri-o://9d2294717b146cbd2b8b0ab6374b632635aea4b71294b2437e9eed6a6553654d" gracePeriod=30 Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.960019 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-notification-agent" containerID="cri-o://16f9da10d39696b5227fcb0dd892cf0faf4ab27401021cf1d467902619b84878" gracePeriod=30 Jan 10 17:16:58 crc kubenswrapper[5036]: I0110 17:16:58.987362 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.049081171 podStartE2EDuration="5.987342972s" podCreationTimestamp="2026-01-10 17:16:53 +0000 UTC" firstStartedPulling="2026-01-10 17:16:54.155558795 +0000 UTC m=+2936.025794289" lastFinishedPulling="2026-01-10 17:16:58.093820596 +0000 UTC m=+2939.964056090" observedRunningTime="2026-01-10 17:16:58.98377324 +0000 UTC m=+2940.854008754" watchObservedRunningTime="2026-01-10 17:16:58.987342972 +0000 UTC m=+2940.857578466" Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.976867 5036 generic.go:334] "Generic (PLEG): container finished" podID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerID="ac3c0ab9e2ed742f6b803953f2724eec344458a41fbbabf735a942b705f67e74" exitCode=0 Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.977237 5036 generic.go:334] "Generic (PLEG): container finished" podID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerID="9d2294717b146cbd2b8b0ab6374b632635aea4b71294b2437e9eed6a6553654d" exitCode=2 Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.977255 5036 generic.go:334] "Generic (PLEG): container finished" podID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerID="16f9da10d39696b5227fcb0dd892cf0faf4ab27401021cf1d467902619b84878" exitCode=0 Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.976961 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerDied","Data":"ac3c0ab9e2ed742f6b803953f2724eec344458a41fbbabf735a942b705f67e74"} Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.977361 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerDied","Data":"9d2294717b146cbd2b8b0ab6374b632635aea4b71294b2437e9eed6a6553654d"} Jan 10 17:16:59 crc kubenswrapper[5036]: I0110 17:16:59.977425 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerDied","Data":"16f9da10d39696b5227fcb0dd892cf0faf4ab27401021cf1d467902619b84878"} Jan 10 17:17:00 crc kubenswrapper[5036]: I0110 17:17:00.452554 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 10 17:17:00 crc kubenswrapper[5036]: I0110 17:17:00.987917 5036 generic.go:334] "Generic (PLEG): container finished" podID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerID="fb3b2de3cfe17c5bca16987ad741e60ba91bd4182aacd2d95cadc390fa5a4af2" exitCode=0 Jan 10 17:17:00 crc kubenswrapper[5036]: I0110 17:17:00.988239 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerDied","Data":"fb3b2de3cfe17c5bca16987ad741e60ba91bd4182aacd2d95cadc390fa5a4af2"} Jan 10 17:17:00 crc kubenswrapper[5036]: I0110 17:17:00.988267 5036 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"febf651c-601b-4bdc-ad24-bbbd48574ea8","Type":"ContainerDied","Data":"4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a"} Jan 10 17:17:00 crc kubenswrapper[5036]: I0110 17:17:00.988280 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.013333 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145030 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145372 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145414 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145476 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145533 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145553 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145624 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145656 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.145715 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2gjp\" (UniqueName: \"kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp\") pod \"febf651c-601b-4bdc-ad24-bbbd48574ea8\" (UID: \"febf651c-601b-4bdc-ad24-bbbd48574ea8\") " Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.146073 5036 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.146774 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.152877 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts" (OuterVolumeSpecName: "scripts") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.153039 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp" (OuterVolumeSpecName: "kube-api-access-d2gjp") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "kube-api-access-d2gjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.186999 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.218600 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.241408 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247339 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247363 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2gjp\" (UniqueName: \"kubernetes.io/projected/febf651c-601b-4bdc-ad24-bbbd48574ea8-kube-api-access-d2gjp\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247373 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247381 5036 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247390 5036 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.247398 5036 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/febf651c-601b-4bdc-ad24-bbbd48574ea8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.266710 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data" (OuterVolumeSpecName: "config-data") pod "febf651c-601b-4bdc-ad24-bbbd48574ea8" (UID: "febf651c-601b-4bdc-ad24-bbbd48574ea8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.349321 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/febf651c-601b-4bdc-ad24-bbbd48574ea8-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.734845 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.793760 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.997746 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.997813 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="manila-scheduler" containerID="cri-o://8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a" gracePeriod=30 Jan 10 17:17:01 crc kubenswrapper[5036]: I0110 17:17:01.997880 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="probe" containerID="cri-o://dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156" gracePeriod=30 Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.069769 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.128766 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.141746 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:17:02 crc kubenswrapper[5036]: E0110 17:17:02.142171 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="proxy-httpd" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142184 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="proxy-httpd" Jan 10 17:17:02 crc kubenswrapper[5036]: E0110 17:17:02.142207 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="sg-core" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142213 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="sg-core" Jan 10 17:17:02 crc kubenswrapper[5036]: E0110 17:17:02.142230 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-notification-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142237 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-notification-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: E0110 17:17:02.142251 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-central-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142258 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-central-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142443 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="proxy-httpd" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142456 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="sg-core" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142481 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-central-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.142493 5036 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" containerName="ceilometer-notification-agent" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.144077 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.151192 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.151272 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.152113 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.156180 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.279958 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-run-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280111 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bpx\" (UniqueName: \"kubernetes.io/projected/eaeec74d-5c59-4684-81e3-7ca32b833f59-kube-api-access-l5bpx\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280146 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-scripts\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280173 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280196 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-config-data\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280219 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.280240 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 
crc kubenswrapper[5036]: I0110 17:17:02.280264 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-log-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.381332 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-log-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.381807 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-run-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.381864 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-log-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382062 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bpx\" (UniqueName: \"kubernetes.io/projected/eaeec74d-5c59-4684-81e3-7ca32b833f59-kube-api-access-l5bpx\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382152 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-scripts\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382227 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382297 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-config-data\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382379 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.382451 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc 
kubenswrapper[5036]: I0110 17:17:02.382295 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eaeec74d-5c59-4684-81e3-7ca32b833f59-run-httpd\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.385790 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-scripts\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.386251 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.386764 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.393617 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-config-data\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.394798 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eaeec74d-5c59-4684-81e3-7ca32b833f59-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.400987 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bpx\" (UniqueName: \"kubernetes.io/projected/eaeec74d-5c59-4684-81e3-7ca32b833f59-kube-api-access-l5bpx\") pod \"ceilometer-0\" (UID: \"eaeec74d-5c59-4684-81e3-7ca32b833f59\") " pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.473896 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.527422 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febf651c-601b-4bdc-ad24-bbbd48574ea8" path="/var/lib/kubelet/pods/febf651c-601b-4bdc-ad24-bbbd48574ea8/volumes" Jan 10 17:17:02 crc kubenswrapper[5036]: I0110 17:17:02.980592 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 10 17:17:03 crc kubenswrapper[5036]: I0110 17:17:03.008489 5036 generic.go:334] "Generic (PLEG): container finished" podID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerID="dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156" exitCode=0 Jan 10 17:17:03 crc kubenswrapper[5036]: I0110 17:17:03.008575 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerDied","Data":"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156"} Jan 10 17:17:03 crc kubenswrapper[5036]: I0110 17:17:03.010636 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaeec74d-5c59-4684-81e3-7ca32b833f59","Type":"ContainerStarted","Data":"6ac1b8d7e6208fe9e6fbc9dfeafcd73e9fb2873cb102a1c7f6055571993540dd"} Jan 10 17:17:03 crc kubenswrapper[5036]: I0110 17:17:03.850731 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.011773 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012161 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012208 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012235 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012272 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl9tg\" (UniqueName: \"kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012304 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod 
"c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012349 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts\") pod \"c6edad84-3f9b-45db-bb29-1609bc82b62a\" (UID: \"c6edad84-3f9b-45db-bb29-1609bc82b62a\") " Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.012789 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c6edad84-3f9b-45db-bb29-1609bc82b62a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.016911 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg" (OuterVolumeSpecName: "kube-api-access-zl9tg") pod "c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). InnerVolumeSpecName "kube-api-access-zl9tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.017229 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.023212 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaeec74d-5c59-4684-81e3-7ca32b833f59","Type":"ContainerStarted","Data":"c64fe58211b322a263c5027bc07ca50d8ac73854bcf503b33bbc4497c3c3f666"} Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.026340 5036 generic.go:334] "Generic (PLEG): container finished" podID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerID="8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a" exitCode=0 Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.026466 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerDied","Data":"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a"} Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.026541 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"c6edad84-3f9b-45db-bb29-1609bc82b62a","Type":"ContainerDied","Data":"e9b535553a3805673a0e16f5b6003039d5d747d7ff60f5c476ea1e9d45dcb284"} Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.026637 5036 scope.go:117] "RemoveContainer" containerID="dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.026838 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.035873 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts" (OuterVolumeSpecName: "scripts") pod "c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.095495 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.115254 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.115289 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.115306 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.115317 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl9tg\" (UniqueName: \"kubernetes.io/projected/c6edad84-3f9b-45db-bb29-1609bc82b62a-kube-api-access-zl9tg\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.120895 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data" (OuterVolumeSpecName: "config-data") pod "c6edad84-3f9b-45db-bb29-1609bc82b62a" (UID: "c6edad84-3f9b-45db-bb29-1609bc82b62a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.201650 5036 scope.go:117] "RemoveContainer" containerID="8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.217772 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6edad84-3f9b-45db-bb29-1609bc82b62a-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.224659 5036 scope.go:117] "RemoveContainer" containerID="dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156" Jan 10 17:17:04 crc kubenswrapper[5036]: E0110 17:17:04.225237 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156\": container with ID starting with dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156 not found: ID does not exist" containerID="dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.225277 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156"} err="failed to get container status \"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156\": rpc error: code = NotFound desc = could not find container \"dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156\": container with ID starting with dcfb9ca570fa1c0a9bbc395b73a466d9a14d83f8911a13f26dd45c1f4b84a156 not found: ID does not exist" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.225301 5036 scope.go:117] "RemoveContainer" containerID="8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a" Jan 10 17:17:04 crc kubenswrapper[5036]: E0110 17:17:04.225547 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a\": container with ID starting with 8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a not found: ID does not exist" containerID="8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.225566 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a"} err="failed to get container status \"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a\": rpc error: code = NotFound desc = could not find container \"8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a\": container with ID starting with 8e09b10bef9a4b51a3ba15d929999eb7610813694259941804d2786a22120d1a not found: ID does not exist" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.361057 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.368477 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.403016 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:04 crc kubenswrapper[5036]: E0110 17:17:04.403478 5036 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="manila-scheduler" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.403500 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="manila-scheduler" Jan 10 17:17:04 crc kubenswrapper[5036]: E0110 17:17:04.403538 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="probe" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.403545 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="probe" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.403725 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="probe" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.403749 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" containerName="manila-scheduler" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.404953 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.410648 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.411548 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.517099 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6edad84-3f9b-45db-bb29-1609bc82b62a" path="/var/lib/kubelet/pods/c6edad84-3f9b-45db-bb29-1609bc82b62a/volumes" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.521960 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-scripts\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.521999 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.522045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.522090 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.522131 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6xj\" (UniqueName: 
\"kubernetes.io/projected/da772573-b489-4f28-85da-5d242835ae61-kube-api-access-nj6xj\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.522148 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da772573-b489-4f28-85da-5d242835ae61-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.624106 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-scripts\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.624409 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.624660 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.624907 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.625127 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6xj\" (UniqueName: \"kubernetes.io/projected/da772573-b489-4f28-85da-5d242835ae61-kube-api-access-nj6xj\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.625307 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da772573-b489-4f28-85da-5d242835ae61-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.625620 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da772573-b489-4f28-85da-5d242835ae61-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.628698 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-scripts\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 
17:17:04.628770 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.629418 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.630754 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da772573-b489-4f28-85da-5d242835ae61-config-data\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.657929 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6xj\" (UniqueName: \"kubernetes.io/projected/da772573-b489-4f28-85da-5d242835ae61-kube-api-access-nj6xj\") pod \"manila-scheduler-0\" (UID: \"da772573-b489-4f28-85da-5d242835ae61\") " pod="openstack/manila-scheduler-0" Jan 10 17:17:04 crc kubenswrapper[5036]: I0110 17:17:04.727707 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 10 17:17:05 crc kubenswrapper[5036]: I0110 17:17:05.038844 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaeec74d-5c59-4684-81e3-7ca32b833f59","Type":"ContainerStarted","Data":"a6b2e3e0dabfb15d508b8ec89127997ee0ccdaf309a74325dd1ace2aa501ea58"} Jan 10 17:17:05 crc kubenswrapper[5036]: I0110 17:17:05.245764 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 10 17:17:06 crc kubenswrapper[5036]: I0110 17:17:06.054698 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da772573-b489-4f28-85da-5d242835ae61","Type":"ContainerStarted","Data":"20916abb579c92987b373d69e168094467a9f112dd706879b58a47c59cee4a1b"} Jan 10 17:17:06 crc kubenswrapper[5036]: I0110 17:17:06.055015 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da772573-b489-4f28-85da-5d242835ae61","Type":"ContainerStarted","Data":"58c554adcacdd9472480bf374d78e8207179acf12142977f3a45a724181e8af8"} Jan 10 17:17:06 crc kubenswrapper[5036]: I0110 17:17:06.057628 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaeec74d-5c59-4684-81e3-7ca32b833f59","Type":"ContainerStarted","Data":"571cfa65e0de14c18d41c18f310caf202727b8d15b0c5f6a11115bb7e4fa431d"} Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.082092 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"da772573-b489-4f28-85da-5d242835ae61","Type":"ContainerStarted","Data":"a5d458e82f34f64365237d32a9427f9940202133fa1dc1c9fd599792ff20944d"} Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.085977 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eaeec74d-5c59-4684-81e3-7ca32b833f59","Type":"ContainerStarted","Data":"c8aaf394fa62f830384b1e0f31e63519edd9adc89482e1e1f02d2608b8c7fb33"} 
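(The entries above all carry the standard klog header after the journal prefix: a severity letter fused with an MMDD date — e.g. I0110 or E0110 — then a wall-clock timestamp, the kubelet PID, and the emitting source file and line, followed by a structured message. Below is a minimal, self-contained Go sketch for pulling that header apart when reading logs like this one; the program, its regular expression, and the klogparse.go name are illustrative assumptions, not anything shipped with the kubelet.)

// klogparse.go: illustrative sketch only. Extracts the klog header from kubelet
// log entries such as:
//   I0110 17:17:04.201650 5036 scope.go:117] "RemoveContainer" containerID="..."
// Fields: severity letter (I/W/E/F), MMDD date, time, PID, source file:line, message.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

// Matches the klog header; the trailing "] " separates it from the message.
var klogHeader = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+:\d+)\] `)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // lines in this log can be very long
	for sc.Scan() {
		line := sc.Text()
		m := klogHeader.FindStringSubmatchIndex(line)
		if m == nil {
			continue // no klog header on this line
		}
		get := func(i int) string { return line[m[2*i]:m[2*i+1]] }
		// Only the first header on a line is parsed; everything after it is
		// treated as the message, even if further entries were wrapped onto
		// the same physical line.
		fmt.Printf("sev=%s date=%s time=%s pid=%s src=%s msg=%s\n",
			get(1), get(2), get(3), get(4), get(5),
			strings.TrimSpace(line[m[1]:]))
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "scan error:", err)
	}
}

(Example invocation, with a hypothetical file name: go run klogparse.go < kubelet.log)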
Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.086146 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.104210 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.104187839 podStartE2EDuration="3.104187839s" podCreationTimestamp="2026-01-10 17:17:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:17:07.099877056 +0000 UTC m=+2948.970112550" watchObservedRunningTime="2026-01-10 17:17:07.104187839 +0000 UTC m=+2948.974423333" Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.136735 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.504886236 podStartE2EDuration="5.136717908s" podCreationTimestamp="2026-01-10 17:17:02 +0000 UTC" firstStartedPulling="2026-01-10 17:17:02.98150075 +0000 UTC m=+2944.851736254" lastFinishedPulling="2026-01-10 17:17:06.613332432 +0000 UTC m=+2948.483567926" observedRunningTime="2026-01-10 17:17:07.1336306 +0000 UTC m=+2949.003866094" watchObservedRunningTime="2026-01-10 17:17:07.136717908 +0000 UTC m=+2949.006953402" Jan 10 17:17:07 crc kubenswrapper[5036]: I0110 17:17:07.806016 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Jan 10 17:17:08 crc kubenswrapper[5036]: I0110 17:17:08.056496 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jan 10 17:17:12 crc kubenswrapper[5036]: I0110 17:17:12.046576 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 10 17:17:12 crc kubenswrapper[5036]: I0110 17:17:12.174518 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:12 crc kubenswrapper[5036]: I0110 17:17:12.174801 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="manila-share" containerID="cri-o://230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" gracePeriod=30 Jan 10 17:17:12 crc kubenswrapper[5036]: I0110 17:17:12.174976 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="probe" containerID="cri-o://73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" gracePeriod=30 Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.137165 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147158 5036 generic.go:334] "Generic (PLEG): container finished" podID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerID="73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" exitCode=0 Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147190 5036 generic.go:334] "Generic (PLEG): container finished" podID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerID="230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" exitCode=1 Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147206 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerDied","Data":"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed"} Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147295 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerDied","Data":"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0"} Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147319 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"bc52704a-a1a3-4a9f-91d5-05035ea65015","Type":"ContainerDied","Data":"9e1578f3fb7f69deed21d32cf4f979316fb97e0282aa5aff52d4608d6d119aa6"} Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147227 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.147370 5036 scope.go:117] "RemoveContainer" containerID="73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.167304 5036 scope.go:117] "RemoveContainer" containerID="230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.211055 5036 scope.go:117] "RemoveContainer" containerID="73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" Jan 10 17:17:13 crc kubenswrapper[5036]: E0110 17:17:13.211666 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed\": container with ID starting with 73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed not found: ID does not exist" containerID="73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.211737 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed"} err="failed to get container status \"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed\": rpc error: code = NotFound desc = could not find container \"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed\": container with ID starting with 73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed not found: ID does not exist" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.211775 5036 scope.go:117] "RemoveContainer" containerID="230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" Jan 10 17:17:13 crc kubenswrapper[5036]: E0110 17:17:13.212265 5036 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0\": container with ID starting with 230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0 not found: ID does not exist" containerID="230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.212320 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0"} err="failed to get container status \"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0\": rpc error: code = NotFound desc = could not find container \"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0\": container with ID starting with 230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0 not found: ID does not exist" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.212352 5036 scope.go:117] "RemoveContainer" containerID="73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.212661 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed"} err="failed to get container status \"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed\": rpc error: code = NotFound desc = could not find container \"73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed\": container with ID starting with 73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed not found: ID does not exist" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.212688 5036 scope.go:117] "RemoveContainer" containerID="230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.213009 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0"} err="failed to get container status \"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0\": rpc error: code = NotFound desc = could not find container \"230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0\": container with ID starting with 230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0 not found: ID does not exist" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219026 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219106 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219164 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 
crc kubenswrapper[5036]: I0110 17:17:13.219165 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219186 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219245 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219327 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219343 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmh5l\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219422 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom\") pod \"bc52704a-a1a3-4a9f-91d5-05035ea65015\" (UID: \"bc52704a-a1a3-4a9f-91d5-05035ea65015\") " Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219867 5036 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.219974 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.226281 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l" (OuterVolumeSpecName: "kube-api-access-hmh5l") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "kube-api-access-hmh5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.227772 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.229943 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph" (OuterVolumeSpecName: "ceph") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.247019 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts" (OuterVolumeSpecName: "scripts") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.287533 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321173 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321210 5036 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/bc52704a-a1a3-4a9f-91d5-05035ea65015-var-lib-manila\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321219 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321232 5036 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-ceph\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321241 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmh5l\" (UniqueName: \"kubernetes.io/projected/bc52704a-a1a3-4a9f-91d5-05035ea65015-kube-api-access-hmh5l\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.321250 5036 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.327541 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data" (OuterVolumeSpecName: "config-data") pod "bc52704a-a1a3-4a9f-91d5-05035ea65015" (UID: "bc52704a-a1a3-4a9f-91d5-05035ea65015"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.423063 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc52704a-a1a3-4a9f-91d5-05035ea65015-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.490638 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.531522 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.549676 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:13 crc kubenswrapper[5036]: E0110 17:17:13.552573 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="probe" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.552590 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="probe" Jan 10 17:17:13 crc kubenswrapper[5036]: E0110 17:17:13.552613 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="manila-share" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.552620 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="manila-share" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.552811 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="manila-share" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.552835 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" containerName="probe" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.553867 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.556261 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.559683 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.646539 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.646612 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.646814 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.646916 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf7t8\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-kube-api-access-bf7t8\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.647047 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-scripts\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.647069 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.647108 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-ceph\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.647145 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc 
kubenswrapper[5036]: I0110 17:17:13.749575 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-scripts\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749639 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749685 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-ceph\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749747 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749854 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749916 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.750064 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.749846 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.750294 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf7t8\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-kube-api-access-bf7t8\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.750461 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.753848 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-ceph\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.753900 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.754285 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-config-data\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.754478 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.754973 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-scripts\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.770636 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf7t8\" (UniqueName: \"kubernetes.io/projected/ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c-kube-api-access-bf7t8\") pod \"manila-share-share1-0\" (UID: \"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c\") " pod="openstack/manila-share-share1-0" Jan 10 17:17:13 crc kubenswrapper[5036]: I0110 17:17:13.888513 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 10 17:17:14 crc kubenswrapper[5036]: I0110 17:17:14.519530 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc52704a-a1a3-4a9f-91d5-05035ea65015" path="/var/lib/kubelet/pods/bc52704a-a1a3-4a9f-91d5-05035ea65015/volumes" Jan 10 17:17:14 crc kubenswrapper[5036]: I0110 17:17:14.521518 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 10 17:17:14 crc kubenswrapper[5036]: I0110 17:17:14.727847 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 10 17:17:15 crc kubenswrapper[5036]: I0110 17:17:15.167439 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c","Type":"ContainerStarted","Data":"51bdfcde0e8201adbe4379ed929a1068a2b7b1b278106ac45f3b30ea4f433c3a"} Jan 10 17:17:15 crc kubenswrapper[5036]: I0110 17:17:15.167739 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c","Type":"ContainerStarted","Data":"6f5a87c31659ad6c8e6b2ff2ddbd62a650b9cd48b8f746ae739d5290371af2bb"} Jan 10 17:17:16 crc kubenswrapper[5036]: I0110 17:17:16.183640 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c","Type":"ContainerStarted","Data":"d14a104524316de6288f1d64dc25a2b16ea5c45b11367cd899b2a594ea6a0a8f"} Jan 10 17:17:16 crc kubenswrapper[5036]: I0110 17:17:16.207457 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.207440333 podStartE2EDuration="3.207440333s" podCreationTimestamp="2026-01-10 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:17:16.200190786 +0000 UTC m=+2958.070426290" watchObservedRunningTime="2026-01-10 17:17:16.207440333 +0000 UTC m=+2958.077675827" Jan 10 17:17:17 crc kubenswrapper[5036]: I0110 17:17:17.798642 5036 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-79f74f6ffb-kzjrv" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.242:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.242:8443: connect: connection refused" Jan 10 17:17:17 crc kubenswrapper[5036]: I0110 17:17:17.799132 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:17:23 crc kubenswrapper[5036]: E0110 17:17:23.156601 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d588d2_de3c_4aa8_9949_cd2cc17beac6.slice/crio-conmon-c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebf651c_601b_4bdc_ad24_bbbd48574ea8.slice/crio-4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7d588d2_de3c_4aa8_9949_cd2cc17beac6.slice/crio-c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607.scope\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52704a_a1a3_4a9f_91d5_05035ea65015.slice/crio-230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52704a_a1a3_4a9f_91d5_05035ea65015.slice/crio-conmon-230f3b830ef35977f71e4c489c588dcee47d49abf359ba25de7d00e7eed71bb0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52704a_a1a3_4a9f_91d5_05035ea65015.slice/crio-conmon-73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52704a_a1a3_4a9f_91d5_05035ea65015.slice/crio-73be6cac20da0ee991c05d2862bb6d8b54c431a36f1c50d7865940463d6646ed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc52704a_a1a3_4a9f_91d5_05035ea65015.slice\": RecentStats: unable to find data in memory cache]" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.258290 5036 generic.go:334] "Generic (PLEG): container finished" podID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerID="c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607" exitCode=137 Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.258504 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerDied","Data":"c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607"} Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.258733 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79f74f6ffb-kzjrv" event={"ID":"b7d588d2-de3c-4aa8-9949-cd2cc17beac6","Type":"ContainerDied","Data":"76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef"} Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.258758 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76735ad99e29f1c250c21ab7a41923dfba7b3779471e4e7ed76b0d99da909fef" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.348857 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.469869 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470016 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470085 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470249 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqqj\" (UniqueName: \"kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470384 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470577 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.470753 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data\") pod \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\" (UID: \"b7d588d2-de3c-4aa8-9949-cd2cc17beac6\") " Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.471100 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs" (OuterVolumeSpecName: "logs") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.471600 5036 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-logs\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.477973 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.482773 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj" (OuterVolumeSpecName: "kube-api-access-bhqqj") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "kube-api-access-bhqqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.503800 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts" (OuterVolumeSpecName: "scripts") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.507472 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data" (OuterVolumeSpecName: "config-data") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.527573 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.560092 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b7d588d2-de3c-4aa8-9949-cd2cc17beac6" (UID: "b7d588d2-de3c-4aa8-9949-cd2cc17beac6"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.576983 5036 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.577037 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.577062 5036 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.577086 5036 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.577111 5036 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-scripts\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.577134 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqqj\" (UniqueName: \"kubernetes.io/projected/b7d588d2-de3c-4aa8-9949-cd2cc17beac6-kube-api-access-bhqqj\") on node \"crc\" DevicePath \"\"" Jan 10 17:17:23 crc kubenswrapper[5036]: I0110 17:17:23.889264 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 10 17:17:24 crc kubenswrapper[5036]: I0110 17:17:24.269942 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79f74f6ffb-kzjrv" Jan 10 17:17:24 crc kubenswrapper[5036]: I0110 17:17:24.325881 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:17:24 crc kubenswrapper[5036]: I0110 17:17:24.339785 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79f74f6ffb-kzjrv"] Jan 10 17:17:24 crc kubenswrapper[5036]: I0110 17:17:24.527509 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" path="/var/lib/kubelet/pods/b7d588d2-de3c-4aa8-9949-cd2cc17beac6/volumes" Jan 10 17:17:26 crc kubenswrapper[5036]: I0110 17:17:26.343487 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 10 17:17:32 crc kubenswrapper[5036]: I0110 17:17:32.489554 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 10 17:17:33 crc kubenswrapper[5036]: E0110 17:17:33.423144 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebf651c_601b_4bdc_ad24_bbbd48574ea8.slice/crio-4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a\": RecentStats: unable to find data in memory cache]" Jan 10 17:17:35 crc kubenswrapper[5036]: I0110 17:17:35.466179 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 10 17:17:43 crc kubenswrapper[5036]: E0110 17:17:43.712978 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebf651c_601b_4bdc_ad24_bbbd48574ea8.slice/crio-4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a\": RecentStats: unable to find data in memory cache]" Jan 10 17:17:54 crc kubenswrapper[5036]: E0110 17:17:54.049620 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebf651c_601b_4bdc_ad24_bbbd48574ea8.slice/crio-4d96fb26b3e940f068629517ac176557a0c07283f3a4e4c6c8293c62f1119f7a\": RecentStats: unable to find data in memory cache]" Jan 10 17:18:06 crc kubenswrapper[5036]: E0110 17:18:06.421211 5036 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.83:55690->38.102.83.83:37657: read tcp 38.102.83.83:55690->38.102.83.83:37657: read: connection reset by peer Jan 10 17:18:06 crc kubenswrapper[5036]: E0110 17:18:06.421279 5036 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.83:55690->38.102.83.83:37657: write tcp 38.102.83.83:55690->38.102.83.83:37657: write: broken pipe Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.459027 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 10 17:18:40 crc kubenswrapper[5036]: E0110 17:18:40.460711 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.460738 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" Jan 10 17:18:40 crc kubenswrapper[5036]: E0110 17:18:40.460784 5036 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon-log" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.460798 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon-log" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.461127 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon-log" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.461155 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d588d2-de3c-4aa8-9949-cd2cc17beac6" containerName="horizon" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.463836 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.467358 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.467560 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.467565 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.467979 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2frgm" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.474924 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510015 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510204 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510297 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510343 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510416 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510560 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2gnp\" (UniqueName: \"kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510601 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510671 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.510866 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613184 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613283 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613315 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613348 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613377 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613429 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2gnp\" (UniqueName: \"kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613452 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613483 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613552 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.613766 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.614047 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.614358 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.614913 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.615056 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.620989 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.621177 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.625570 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.637142 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2gnp\" (UniqueName: \"kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.658904 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " pod="openstack/tempest-tests-tempest" Jan 10 17:18:40 crc kubenswrapper[5036]: I0110 17:18:40.801598 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 10 17:18:41 crc kubenswrapper[5036]: I0110 17:18:41.322179 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 10 17:18:42 crc kubenswrapper[5036]: I0110 17:18:42.261121 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d98e00b3-6224-462a-abe0-52e09ac44fb8","Type":"ContainerStarted","Data":"efe08ff2da51b6fc3bede9a4ea9c6a4423ad62626df2d765702e24b4cdd730e1"} Jan 10 17:18:55 crc kubenswrapper[5036]: I0110 17:18:55.904115 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:18:55 crc kubenswrapper[5036]: I0110 17:18:55.904811 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:19:14 crc kubenswrapper[5036]: E0110 17:19:14.215159 5036 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 10 17:19:14 crc kubenswrapper[5036]: E0110 17:19:14.216183 5036 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w2gnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.
io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d98e00b3-6224-462a-abe0-52e09ac44fb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 10 17:19:14 crc kubenswrapper[5036]: E0110 17:19:14.217564 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d98e00b3-6224-462a-abe0-52e09ac44fb8" Jan 10 17:19:14 crc kubenswrapper[5036]: E0110 17:19:14.738909 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d98e00b3-6224-462a-abe0-52e09ac44fb8" Jan 10 17:19:25 crc kubenswrapper[5036]: I0110 17:19:25.903890 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:19:25 crc kubenswrapper[5036]: I0110 17:19:25.904511 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:19:28 crc kubenswrapper[5036]: I0110 17:19:28.037433 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 10 17:19:29 crc kubenswrapper[5036]: I0110 17:19:29.898074 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d98e00b3-6224-462a-abe0-52e09ac44fb8","Type":"ContainerStarted","Data":"dfc18f8e25aba213b02efdda70f8cb0c8d1b667257c3992acd9750f29e60322d"} Jan 10 17:19:29 crc kubenswrapper[5036]: I0110 17:19:29.933026 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.210587136 
podStartE2EDuration="50.932998635s" podCreationTimestamp="2026-01-10 17:18:39 +0000 UTC" firstStartedPulling="2026-01-10 17:18:41.311541975 +0000 UTC m=+3043.181777489" lastFinishedPulling="2026-01-10 17:19:28.033953464 +0000 UTC m=+3089.904188988" observedRunningTime="2026-01-10 17:19:29.921967231 +0000 UTC m=+3091.792202735" watchObservedRunningTime="2026-01-10 17:19:29.932998635 +0000 UTC m=+3091.803234159" Jan 10 17:19:55 crc kubenswrapper[5036]: I0110 17:19:55.903988 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:19:55 crc kubenswrapper[5036]: I0110 17:19:55.904602 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:19:55 crc kubenswrapper[5036]: I0110 17:19:55.904661 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:19:55 crc kubenswrapper[5036]: I0110 17:19:55.905650 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:19:55 crc kubenswrapper[5036]: I0110 17:19:55.905753 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" gracePeriod=600 Jan 10 17:19:56 crc kubenswrapper[5036]: E0110 17:19:56.032067 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:19:56 crc kubenswrapper[5036]: I0110 17:19:56.202565 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" exitCode=0 Jan 10 17:19:56 crc kubenswrapper[5036]: I0110 17:19:56.202644 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4"} Jan 10 17:19:56 crc kubenswrapper[5036]: I0110 17:19:56.202711 5036 scope.go:117] "RemoveContainer" containerID="e1e211d00f0a3d2cccd996d6fd957c8fef52f7908e7b7faa418a6b65ea4298f3" Jan 10 17:19:56 crc kubenswrapper[5036]: I0110 17:19:56.203596 5036 scope.go:117] 
"RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:19:56 crc kubenswrapper[5036]: E0110 17:19:56.205562 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:20:11 crc kubenswrapper[5036]: I0110 17:20:11.509399 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:20:11 crc kubenswrapper[5036]: E0110 17:20:11.510419 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:20:23 crc kubenswrapper[5036]: I0110 17:20:23.507928 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:20:23 crc kubenswrapper[5036]: E0110 17:20:23.508528 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:20:37 crc kubenswrapper[5036]: I0110 17:20:37.508405 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:20:37 crc kubenswrapper[5036]: E0110 17:20:37.510190 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:20:50 crc kubenswrapper[5036]: I0110 17:20:50.508912 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:20:50 crc kubenswrapper[5036]: E0110 17:20:50.509752 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.586220 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.588795 5036 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.627435 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.650870 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.650977 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6bzd\" (UniqueName: \"kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.651239 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.753414 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.753520 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6bzd\" (UniqueName: \"kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.753606 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.754209 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.754256 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.783743 5036 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6bzd\" (UniqueName: \"kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd\") pod \"certified-operators-srt8f\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:20:59 crc kubenswrapper[5036]: I0110 17:20:59.907109 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:00 crc kubenswrapper[5036]: I0110 17:21:00.640813 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:21:00 crc kubenswrapper[5036]: W0110 17:21:00.655415 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff7c332e_cd66_47cb_ab4d_fccf3d80c1b4.slice/crio-1b61fb8cb267ca3f425f7a90dbb714789e7836c5ff75c94a0a934c937ac3e710 WatchSource:0}: Error finding container 1b61fb8cb267ca3f425f7a90dbb714789e7836c5ff75c94a0a934c937ac3e710: Status 404 returned error can't find the container with id 1b61fb8cb267ca3f425f7a90dbb714789e7836c5ff75c94a0a934c937ac3e710 Jan 10 17:21:00 crc kubenswrapper[5036]: I0110 17:21:00.837626 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerStarted","Data":"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5"} Jan 10 17:21:00 crc kubenswrapper[5036]: I0110 17:21:00.837668 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerStarted","Data":"1b61fb8cb267ca3f425f7a90dbb714789e7836c5ff75c94a0a934c937ac3e710"} Jan 10 17:21:01 crc kubenswrapper[5036]: I0110 17:21:01.847949 5036 generic.go:334] "Generic (PLEG): container finished" podID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerID="32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5" exitCode=0 Jan 10 17:21:01 crc kubenswrapper[5036]: I0110 17:21:01.848275 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerDied","Data":"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5"} Jan 10 17:21:02 crc kubenswrapper[5036]: I0110 17:21:02.858988 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerStarted","Data":"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec"} Jan 10 17:21:03 crc kubenswrapper[5036]: I0110 17:21:03.872112 5036 generic.go:334] "Generic (PLEG): container finished" podID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerID="4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec" exitCode=0 Jan 10 17:21:03 crc kubenswrapper[5036]: I0110 17:21:03.872190 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerDied","Data":"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec"} Jan 10 17:21:04 crc kubenswrapper[5036]: I0110 17:21:04.509180 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 
17:21:04 crc kubenswrapper[5036]: E0110 17:21:04.509389 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:21:04 crc kubenswrapper[5036]: I0110 17:21:04.881801 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerStarted","Data":"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356"} Jan 10 17:21:04 crc kubenswrapper[5036]: I0110 17:21:04.906230 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srt8f" podStartSLOduration=3.421398023 podStartE2EDuration="5.906206323s" podCreationTimestamp="2026-01-10 17:20:59 +0000 UTC" firstStartedPulling="2026-01-10 17:21:01.850477237 +0000 UTC m=+3183.720712741" lastFinishedPulling="2026-01-10 17:21:04.335285547 +0000 UTC m=+3186.205521041" observedRunningTime="2026-01-10 17:21:04.90575172 +0000 UTC m=+3186.775987224" watchObservedRunningTime="2026-01-10 17:21:04.906206323 +0000 UTC m=+3186.776441817" Jan 10 17:21:09 crc kubenswrapper[5036]: I0110 17:21:09.908963 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:09 crc kubenswrapper[5036]: I0110 17:21:09.909716 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:10 crc kubenswrapper[5036]: I0110 17:21:10.015572 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:10 crc kubenswrapper[5036]: I0110 17:21:10.117579 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:10 crc kubenswrapper[5036]: I0110 17:21:10.258159 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.065443 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srt8f" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="registry-server" containerID="cri-o://2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356" gracePeriod=2 Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.822881 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.944931 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities\") pod \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.944997 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content\") pod \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.945132 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6bzd\" (UniqueName: \"kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd\") pod \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\" (UID: \"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4\") " Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.946096 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities" (OuterVolumeSpecName: "utilities") pod "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" (UID: "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.952972 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd" (OuterVolumeSpecName: "kube-api-access-f6bzd") pod "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" (UID: "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4"). InnerVolumeSpecName "kube-api-access-f6bzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:21:12 crc kubenswrapper[5036]: I0110 17:21:12.992155 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" (UID: "ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.047655 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.047698 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.047709 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6bzd\" (UniqueName: \"kubernetes.io/projected/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4-kube-api-access-f6bzd\") on node \"crc\" DevicePath \"\"" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.079245 5036 generic.go:334] "Generic (PLEG): container finished" podID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerID="2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356" exitCode=0 Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.079379 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerDied","Data":"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356"} Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.079431 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srt8f" event={"ID":"ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4","Type":"ContainerDied","Data":"1b61fb8cb267ca3f425f7a90dbb714789e7836c5ff75c94a0a934c937ac3e710"} Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.079458 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srt8f" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.079468 5036 scope.go:117] "RemoveContainer" containerID="2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.112205 5036 scope.go:117] "RemoveContainer" containerID="4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.126110 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.135034 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srt8f"] Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.160218 5036 scope.go:117] "RemoveContainer" containerID="32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.202090 5036 scope.go:117] "RemoveContainer" containerID="2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356" Jan 10 17:21:13 crc kubenswrapper[5036]: E0110 17:21:13.202666 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356\": container with ID starting with 2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356 not found: ID does not exist" containerID="2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.202738 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356"} err="failed to get container status \"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356\": rpc error: code = NotFound desc = could not find container \"2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356\": container with ID starting with 2c0af834de375a4065cb5db77fef29566f89676be58e819e1e45b3200e9b6356 not found: ID does not exist" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.202770 5036 scope.go:117] "RemoveContainer" containerID="4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec" Jan 10 17:21:13 crc kubenswrapper[5036]: E0110 17:21:13.203219 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec\": container with ID starting with 4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec not found: ID does not exist" containerID="4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.203248 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec"} err="failed to get container status \"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec\": rpc error: code = NotFound desc = could not find container \"4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec\": container with ID starting with 4216c5de7359a36414b0ab9ddc1d3bef78a3ab8156cf44e9f0b4694548a5e9ec not found: ID does not exist" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.203270 5036 scope.go:117] "RemoveContainer" 
containerID="32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5" Jan 10 17:21:13 crc kubenswrapper[5036]: E0110 17:21:13.203518 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5\": container with ID starting with 32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5 not found: ID does not exist" containerID="32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5" Jan 10 17:21:13 crc kubenswrapper[5036]: I0110 17:21:13.203541 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5"} err="failed to get container status \"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5\": rpc error: code = NotFound desc = could not find container \"32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5\": container with ID starting with 32f2de29a6d076acbb783a56b967b6703fbcf7c71e42e23e3ee8170aee82fca5 not found: ID does not exist" Jan 10 17:21:14 crc kubenswrapper[5036]: I0110 17:21:14.524395 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" path="/var/lib/kubelet/pods/ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4/volumes" Jan 10 17:21:16 crc kubenswrapper[5036]: I0110 17:21:16.508637 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:21:16 crc kubenswrapper[5036]: E0110 17:21:16.511100 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:21:30 crc kubenswrapper[5036]: I0110 17:21:30.508487 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:21:30 crc kubenswrapper[5036]: E0110 17:21:30.509155 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:21:43 crc kubenswrapper[5036]: I0110 17:21:43.509078 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:21:43 crc kubenswrapper[5036]: E0110 17:21:43.510362 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:21:55 crc kubenswrapper[5036]: I0110 17:21:55.509127 5036 scope.go:117] "RemoveContainer" 
containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:21:55 crc kubenswrapper[5036]: E0110 17:21:55.510179 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:22:02 crc kubenswrapper[5036]: I0110 17:22:02.643171 5036 generic.go:334] "Generic (PLEG): container finished" podID="d98e00b3-6224-462a-abe0-52e09ac44fb8" containerID="dfc18f8e25aba213b02efdda70f8cb0c8d1b667257c3992acd9750f29e60322d" exitCode=0 Jan 10 17:22:02 crc kubenswrapper[5036]: I0110 17:22:02.643264 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d98e00b3-6224-462a-abe0-52e09ac44fb8","Type":"ContainerDied","Data":"dfc18f8e25aba213b02efdda70f8cb0c8d1b667257c3992acd9750f29e60322d"} Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.107467 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.174861 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175005 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2gnp\" (UniqueName: \"kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175058 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175134 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175186 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175223 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175334 5036 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175431 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.175478 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key\") pod \"d98e00b3-6224-462a-abe0-52e09ac44fb8\" (UID: \"d98e00b3-6224-462a-abe0-52e09ac44fb8\") " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.176526 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data" (OuterVolumeSpecName: "config-data") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.176926 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.182946 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp" (OuterVolumeSpecName: "kube-api-access-w2gnp") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "kube-api-access-w2gnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.184559 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.194278 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.213794 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.239877 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.241049 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.276338 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d98e00b3-6224-462a-abe0-52e09ac44fb8" (UID: "d98e00b3-6224-462a-abe0-52e09ac44fb8"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277795 5036 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277820 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2gnp\" (UniqueName: \"kubernetes.io/projected/d98e00b3-6224-462a-abe0-52e09ac44fb8-kube-api-access-w2gnp\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277830 5036 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277880 5036 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277892 5036 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-config-data\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277902 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277931 5036 reconciler_common.go:293] "Volume detached for 
volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d98e00b3-6224-462a-abe0-52e09ac44fb8-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277942 5036 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d98e00b3-6224-462a-abe0-52e09ac44fb8-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.277952 5036 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d98e00b3-6224-462a-abe0-52e09ac44fb8-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.305861 5036 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.381417 5036 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.661884 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d98e00b3-6224-462a-abe0-52e09ac44fb8","Type":"ContainerDied","Data":"efe08ff2da51b6fc3bede9a4ea9c6a4423ad62626df2d765702e24b4cdd730e1"} Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.662184 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe08ff2da51b6fc3bede9a4ea9c6a4423ad62626df2d765702e24b4cdd730e1" Jan 10 17:22:04 crc kubenswrapper[5036]: I0110 17:22:04.661940 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 10 17:22:07 crc kubenswrapper[5036]: I0110 17:22:07.509129 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:22:07 crc kubenswrapper[5036]: E0110 17:22:07.509906 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.976832 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 10 17:22:16 crc kubenswrapper[5036]: E0110 17:22:16.977656 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="registry-server" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977666 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="registry-server" Jan 10 17:22:16 crc kubenswrapper[5036]: E0110 17:22:16.977707 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="extract-utilities" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977713 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="extract-utilities" Jan 10 17:22:16 crc kubenswrapper[5036]: E0110 17:22:16.977732 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e00b3-6224-462a-abe0-52e09ac44fb8" containerName="tempest-tests-tempest-tests-runner" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977738 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e00b3-6224-462a-abe0-52e09ac44fb8" containerName="tempest-tests-tempest-tests-runner" Jan 10 17:22:16 crc kubenswrapper[5036]: E0110 17:22:16.977755 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="extract-content" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977760 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="extract-content" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977922 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7c332e-cd66-47cb-ab4d-fccf3d80c1b4" containerName="registry-server" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.977935 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98e00b3-6224-462a-abe0-52e09ac44fb8" containerName="tempest-tests-tempest-tests-runner" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.978533 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.981260 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-2frgm" Jan 10 17:22:16 crc kubenswrapper[5036]: I0110 17:22:16.987365 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.044150 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.044328 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pj97\" (UniqueName: \"kubernetes.io/projected/bbb64716-afc7-4c1a-be2f-2ff9cc886e96-kube-api-access-8pj97\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.146772 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.147053 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pj97\" (UniqueName: \"kubernetes.io/projected/bbb64716-afc7-4c1a-be2f-2ff9cc886e96-kube-api-access-8pj97\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.147254 5036 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.173915 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pj97\" (UniqueName: \"kubernetes.io/projected/bbb64716-afc7-4c1a-be2f-2ff9cc886e96-kube-api-access-8pj97\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.175700 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"bbb64716-afc7-4c1a-be2f-2ff9cc886e96\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc 
kubenswrapper[5036]: I0110 17:22:17.302785 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.908145 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 10 17:22:17 crc kubenswrapper[5036]: I0110 17:22:17.936153 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 17:22:18 crc kubenswrapper[5036]: I0110 17:22:18.801162 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"bbb64716-afc7-4c1a-be2f-2ff9cc886e96","Type":"ContainerStarted","Data":"160752d5bc81670a894d81d489e36648a057c125f52f0fa1a1f0c72c70daefff"} Jan 10 17:22:19 crc kubenswrapper[5036]: I0110 17:22:19.816620 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"bbb64716-afc7-4c1a-be2f-2ff9cc886e96","Type":"ContainerStarted","Data":"1d1c904ec43007877603cf318088977d337a9ef57d9bd60e6a1a2072de5e43d1"} Jan 10 17:22:19 crc kubenswrapper[5036]: I0110 17:22:19.846087 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.706518685 podStartE2EDuration="3.846059853s" podCreationTimestamp="2026-01-10 17:22:16 +0000 UTC" firstStartedPulling="2026-01-10 17:22:17.935894685 +0000 UTC m=+3259.806130179" lastFinishedPulling="2026-01-10 17:22:19.075435843 +0000 UTC m=+3260.945671347" observedRunningTime="2026-01-10 17:22:19.836311676 +0000 UTC m=+3261.706547190" watchObservedRunningTime="2026-01-10 17:22:19.846059853 +0000 UTC m=+3261.716295377" Jan 10 17:22:21 crc kubenswrapper[5036]: I0110 17:22:21.509311 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:22:21 crc kubenswrapper[5036]: E0110 17:22:21.510061 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:22:24 crc kubenswrapper[5036]: I0110 17:22:24.189386 5036 scope.go:117] "RemoveContainer" containerID="088b6f96a798183b597c6041d9cff8936c5b19f8ab1c3680df9108d8aca41ab0" Jan 10 17:22:24 crc kubenswrapper[5036]: I0110 17:22:24.213544 5036 scope.go:117] "RemoveContainer" containerID="73db86f898d4868807c8cb5337217949e7654dcf3bbcb3521f1ffef7a31582d6" Jan 10 17:22:24 crc kubenswrapper[5036]: I0110 17:22:24.374588 5036 scope.go:117] "RemoveContainer" containerID="c47503d9d239a27615fac186d4db643215e81fe8554510b4c3096096c664f607" Jan 10 17:22:24 crc kubenswrapper[5036]: I0110 17:22:24.394996 5036 scope.go:117] "RemoveContainer" containerID="33f3578f36c2eac18c7decc55a04cf76ce07f0a11f1219e396288278723db621" Jan 10 17:22:34 crc kubenswrapper[5036]: I0110 17:22:34.514334 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:22:34 crc kubenswrapper[5036]: E0110 17:22:34.515361 5036 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:22:42 crc kubenswrapper[5036]: I0110 17:22:42.998581 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdbxc/must-gather-skwkm"] Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.000641 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.006133 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kdbxc"/"kube-root-ca.crt" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.006256 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kdbxc"/"openshift-service-ca.crt" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.019496 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdbxc/must-gather-skwkm"] Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.068157 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output\") pod \"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.068315 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxbd7\" (UniqueName: \"kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7\") pod \"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.170711 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output\") pod \"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.170887 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxbd7\" (UniqueName: \"kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7\") pod \"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.171214 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output\") pod \"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.190478 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxbd7\" (UniqueName: \"kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7\") pod 
\"must-gather-skwkm\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.316961 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:22:43 crc kubenswrapper[5036]: I0110 17:22:43.852109 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdbxc/must-gather-skwkm"] Jan 10 17:22:44 crc kubenswrapper[5036]: I0110 17:22:44.113743 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/must-gather-skwkm" event={"ID":"e020151a-987f-4113-8b01-ea7b8c426757","Type":"ContainerStarted","Data":"e3e635c7ee181dad1a65783fa9890873674ccbdf1f6e09aa12e784dd4ecea277"} Jan 10 17:22:45 crc kubenswrapper[5036]: I0110 17:22:45.507947 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:22:45 crc kubenswrapper[5036]: E0110 17:22:45.508587 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:22:51 crc kubenswrapper[5036]: I0110 17:22:51.174073 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/must-gather-skwkm" event={"ID":"e020151a-987f-4113-8b01-ea7b8c426757","Type":"ContainerStarted","Data":"7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608"} Jan 10 17:22:51 crc kubenswrapper[5036]: I0110 17:22:51.174730 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/must-gather-skwkm" event={"ID":"e020151a-987f-4113-8b01-ea7b8c426757","Type":"ContainerStarted","Data":"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee"} Jan 10 17:22:51 crc kubenswrapper[5036]: I0110 17:22:51.213576 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdbxc/must-gather-skwkm" podStartSLOduration=2.810608031 podStartE2EDuration="9.213548643s" podCreationTimestamp="2026-01-10 17:22:42 +0000 UTC" firstStartedPulling="2026-01-10 17:22:43.859701221 +0000 UTC m=+3285.729936715" lastFinishedPulling="2026-01-10 17:22:50.262641833 +0000 UTC m=+3292.132877327" observedRunningTime="2026-01-10 17:22:51.198567148 +0000 UTC m=+3293.068802682" watchObservedRunningTime="2026-01-10 17:22:51.213548643 +0000 UTC m=+3293.083784177" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.571348 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-cm9h7"] Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.573804 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.577814 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kdbxc"/"default-dockercfg-w6qgg" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.618299 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.618603 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.720604 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.720745 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.720768 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.741139 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t\") pod \"crc-debug-cm9h7\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:54 crc kubenswrapper[5036]: I0110 17:22:54.891280 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:22:55 crc kubenswrapper[5036]: I0110 17:22:55.211473 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" event={"ID":"a3031610-5147-40ce-9405-27e6a6b816bd","Type":"ContainerStarted","Data":"2da8112748e65727bf8c6f98280036545ed86cb46715e023b06993c476ffc77c"} Jan 10 17:22:59 crc kubenswrapper[5036]: I0110 17:22:59.508267 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:22:59 crc kubenswrapper[5036]: E0110 17:22:59.508958 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:23:06 crc kubenswrapper[5036]: I0110 17:23:06.348291 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" event={"ID":"a3031610-5147-40ce-9405-27e6a6b816bd","Type":"ContainerStarted","Data":"bffe98affa727d66ad1f45903a677bb512bc7e30dc73302bfd19af92497e31bb"} Jan 10 17:23:06 crc kubenswrapper[5036]: I0110 17:23:06.377144 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" podStartSLOduration=1.400067282 podStartE2EDuration="12.377115152s" podCreationTimestamp="2026-01-10 17:22:54 +0000 UTC" firstStartedPulling="2026-01-10 17:22:54.936217974 +0000 UTC m=+3296.806453468" lastFinishedPulling="2026-01-10 17:23:05.913265824 +0000 UTC m=+3307.783501338" observedRunningTime="2026-01-10 17:23:06.363728213 +0000 UTC m=+3308.233963707" watchObservedRunningTime="2026-01-10 17:23:06.377115152 +0000 UTC m=+3308.247350646" Jan 10 17:23:12 crc kubenswrapper[5036]: I0110 17:23:12.508172 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:23:12 crc kubenswrapper[5036]: E0110 17:23:12.509008 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:23:23 crc kubenswrapper[5036]: I0110 17:23:23.541657 5036 generic.go:334] "Generic (PLEG): container finished" podID="a3031610-5147-40ce-9405-27e6a6b816bd" containerID="bffe98affa727d66ad1f45903a677bb512bc7e30dc73302bfd19af92497e31bb" exitCode=0 Jan 10 17:23:23 crc kubenswrapper[5036]: I0110 17:23:23.541728 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" event={"ID":"a3031610-5147-40ce-9405-27e6a6b816bd","Type":"ContainerDied","Data":"bffe98affa727d66ad1f45903a677bb512bc7e30dc73302bfd19af92497e31bb"} Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.633990 5036 scope.go:117] "RemoveContainer" containerID="ac3c0ab9e2ed742f6b803953f2724eec344458a41fbbabf735a942b705f67e74" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.660177 5036 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.663909 5036 scope.go:117] "RemoveContainer" containerID="fb3b2de3cfe17c5bca16987ad741e60ba91bd4182aacd2d95cadc390fa5a4af2" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.697899 5036 scope.go:117] "RemoveContainer" containerID="9d2294717b146cbd2b8b0ab6374b632635aea4b71294b2437e9eed6a6553654d" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.712468 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-cm9h7"] Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.719967 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-cm9h7"] Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.722073 5036 scope.go:117] "RemoveContainer" containerID="16f9da10d39696b5227fcb0dd892cf0faf4ab27401021cf1d467902619b84878" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.836132 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host\") pod \"a3031610-5147-40ce-9405-27e6a6b816bd\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.836281 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host" (OuterVolumeSpecName: "host") pod "a3031610-5147-40ce-9405-27e6a6b816bd" (UID: "a3031610-5147-40ce-9405-27e6a6b816bd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.836395 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t\") pod \"a3031610-5147-40ce-9405-27e6a6b816bd\" (UID: \"a3031610-5147-40ce-9405-27e6a6b816bd\") " Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.837040 5036 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3031610-5147-40ce-9405-27e6a6b816bd-host\") on node \"crc\" DevicePath \"\"" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.850489 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t" (OuterVolumeSpecName: "kube-api-access-wxn7t") pod "a3031610-5147-40ce-9405-27e6a6b816bd" (UID: "a3031610-5147-40ce-9405-27e6a6b816bd"). InnerVolumeSpecName "kube-api-access-wxn7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:23:24 crc kubenswrapper[5036]: I0110 17:23:24.938902 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxn7t\" (UniqueName: \"kubernetes.io/projected/a3031610-5147-40ce-9405-27e6a6b816bd-kube-api-access-wxn7t\") on node \"crc\" DevicePath \"\"" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.560923 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da8112748e65727bf8c6f98280036545ed86cb46715e023b06993c476ffc77c" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.561056 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-cm9h7" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.932168 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-59swz"] Jan 10 17:23:25 crc kubenswrapper[5036]: E0110 17:23:25.933004 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3031610-5147-40ce-9405-27e6a6b816bd" containerName="container-00" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.933019 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3031610-5147-40ce-9405-27e6a6b816bd" containerName="container-00" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.933184 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3031610-5147-40ce-9405-27e6a6b816bd" containerName="container-00" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.933825 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:25 crc kubenswrapper[5036]: I0110 17:23:25.936085 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kdbxc"/"default-dockercfg-w6qgg" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.060389 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6cc6\" (UniqueName: \"kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.060468 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.162177 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6cc6\" (UniqueName: \"kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.162221 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.162348 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.179599 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6cc6\" (UniqueName: \"kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6\") pod \"crc-debug-59swz\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 
17:23:26.248632 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.510475 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:23:26 crc kubenswrapper[5036]: E0110 17:23:26.510995 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.524917 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3031610-5147-40ce-9405-27e6a6b816bd" path="/var/lib/kubelet/pods/a3031610-5147-40ce-9405-27e6a6b816bd/volumes" Jan 10 17:23:26 crc kubenswrapper[5036]: I0110 17:23:26.569255 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/crc-debug-59swz" event={"ID":"5b6e98b1-1261-48fc-9b61-03da6381385a","Type":"ContainerStarted","Data":"29d64659a91ce65b546bd5a3800f45ae7a45a8fb3df3a5449ceec41aa3239c73"} Jan 10 17:23:27 crc kubenswrapper[5036]: I0110 17:23:27.579364 5036 generic.go:334] "Generic (PLEG): container finished" podID="5b6e98b1-1261-48fc-9b61-03da6381385a" containerID="52d91b26810e7ffcb8e33003997c78f92c2a1bd7cba613ce09fd2182d29bc267" exitCode=1 Jan 10 17:23:27 crc kubenswrapper[5036]: I0110 17:23:27.579481 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/crc-debug-59swz" event={"ID":"5b6e98b1-1261-48fc-9b61-03da6381385a","Type":"ContainerDied","Data":"52d91b26810e7ffcb8e33003997c78f92c2a1bd7cba613ce09fd2182d29bc267"} Jan 10 17:23:27 crc kubenswrapper[5036]: I0110 17:23:27.620162 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-59swz"] Jan 10 17:23:27 crc kubenswrapper[5036]: I0110 17:23:27.632301 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdbxc/crc-debug-59swz"] Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.702846 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.814951 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host\") pod \"5b6e98b1-1261-48fc-9b61-03da6381385a\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.815215 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6cc6\" (UniqueName: \"kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6\") pod \"5b6e98b1-1261-48fc-9b61-03da6381385a\" (UID: \"5b6e98b1-1261-48fc-9b61-03da6381385a\") " Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.815071 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host" (OuterVolumeSpecName: "host") pod "5b6e98b1-1261-48fc-9b61-03da6381385a" (UID: "5b6e98b1-1261-48fc-9b61-03da6381385a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.815940 5036 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b6e98b1-1261-48fc-9b61-03da6381385a-host\") on node \"crc\" DevicePath \"\"" Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.822849 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6" (OuterVolumeSpecName: "kube-api-access-t6cc6") pod "5b6e98b1-1261-48fc-9b61-03da6381385a" (UID: "5b6e98b1-1261-48fc-9b61-03da6381385a"). InnerVolumeSpecName "kube-api-access-t6cc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:23:28 crc kubenswrapper[5036]: I0110 17:23:28.918464 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6cc6\" (UniqueName: \"kubernetes.io/projected/5b6e98b1-1261-48fc-9b61-03da6381385a-kube-api-access-t6cc6\") on node \"crc\" DevicePath \"\"" Jan 10 17:23:29 crc kubenswrapper[5036]: I0110 17:23:29.601973 5036 scope.go:117] "RemoveContainer" containerID="52d91b26810e7ffcb8e33003997c78f92c2a1bd7cba613ce09fd2182d29bc267" Jan 10 17:23:29 crc kubenswrapper[5036]: I0110 17:23:29.602054 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/crc-debug-59swz" Jan 10 17:23:30 crc kubenswrapper[5036]: I0110 17:23:30.521937 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b6e98b1-1261-48fc-9b61-03da6381385a" path="/var/lib/kubelet/pods/5b6e98b1-1261-48fc-9b61-03da6381385a/volumes" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.508701 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:23:41 crc kubenswrapper[5036]: E0110 17:23:41.509407 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.671382 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:23:41 crc kubenswrapper[5036]: E0110 17:23:41.673131 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b6e98b1-1261-48fc-9b61-03da6381385a" containerName="container-00" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.673158 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b6e98b1-1261-48fc-9b61-03da6381385a" containerName="container-00" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.673442 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6e98b1-1261-48fc-9b61-03da6381385a" containerName="container-00" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.675233 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.691143 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.841799 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.841889 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzv6\" (UniqueName: \"kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.842064 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.866126 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.868507 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.893318 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943475 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6ch\" (UniqueName: \"kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943591 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943632 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943745 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxzv6\" (UniqueName: \"kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6\") pod \"redhat-marketplace-dnbcp\" 
(UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943942 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.943966 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.944190 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.944400 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.980802 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxzv6\" (UniqueName: \"kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6\") pod \"redhat-marketplace-dnbcp\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:41 crc kubenswrapper[5036]: I0110 17:23:41.997141 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.058047 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.058151 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6ch\" (UniqueName: \"kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.058219 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.058597 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.060983 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.077208 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6ch\" (UniqueName: \"kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch\") pod \"redhat-operators-fhqxn\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.184150 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.538888 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.775075 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerStarted","Data":"4ad02e9e715a5fbde5d5d9c777b71125daf25180c36c3c739866a773ed34eaf1"} Jan 10 17:23:42 crc kubenswrapper[5036]: I0110 17:23:42.815555 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:23:42 crc kubenswrapper[5036]: E0110 17:23:42.999751 5036 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bfa0788_b3fd_4389_b6e6_2ae506c37306.slice/crio-conmon-4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bfa0788_b3fd_4389_b6e6_2ae506c37306.slice/crio-4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8.scope\": RecentStats: unable to find data in memory cache]" Jan 10 17:23:43 crc kubenswrapper[5036]: I0110 17:23:43.783650 5036 generic.go:334] "Generic (PLEG): container finished" podID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerID="4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8" exitCode=0 Jan 10 17:23:43 crc kubenswrapper[5036]: I0110 17:23:43.783733 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerDied","Data":"4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8"} Jan 10 17:23:43 crc kubenswrapper[5036]: I0110 17:23:43.785734 5036 generic.go:334] "Generic (PLEG): container finished" podID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerID="0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049" exitCode=0 Jan 10 17:23:43 crc kubenswrapper[5036]: I0110 17:23:43.785773 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerDied","Data":"0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049"} Jan 10 17:23:43 crc kubenswrapper[5036]: I0110 17:23:43.785799 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerStarted","Data":"37d9739685f68578f3e182cf2cd13cf637b9a0fd6f88a638ebcffc6c504f8673"} Jan 10 17:23:44 crc kubenswrapper[5036]: I0110 17:23:44.795887 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerStarted","Data":"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042"} Jan 10 17:23:44 crc kubenswrapper[5036]: I0110 17:23:44.798772 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerStarted","Data":"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41"} Jan 10 17:23:45 crc 
kubenswrapper[5036]: I0110 17:23:45.808182 5036 generic.go:334] "Generic (PLEG): container finished" podID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerID="4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042" exitCode=0 Jan 10 17:23:45 crc kubenswrapper[5036]: I0110 17:23:45.808281 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerDied","Data":"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042"} Jan 10 17:23:47 crc kubenswrapper[5036]: I0110 17:23:47.830383 5036 generic.go:334] "Generic (PLEG): container finished" podID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerID="1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41" exitCode=0 Jan 10 17:23:47 crc kubenswrapper[5036]: I0110 17:23:47.830484 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerDied","Data":"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41"} Jan 10 17:23:48 crc kubenswrapper[5036]: I0110 17:23:48.843314 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerStarted","Data":"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90"} Jan 10 17:23:48 crc kubenswrapper[5036]: I0110 17:23:48.848195 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerStarted","Data":"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5"} Jan 10 17:23:48 crc kubenswrapper[5036]: I0110 17:23:48.875914 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dnbcp" podStartSLOduration=3.45563291 podStartE2EDuration="7.875894619s" podCreationTimestamp="2026-01-10 17:23:41 +0000 UTC" firstStartedPulling="2026-01-10 17:23:43.787568768 +0000 UTC m=+3345.657804262" lastFinishedPulling="2026-01-10 17:23:48.207830447 +0000 UTC m=+3350.078065971" observedRunningTime="2026-01-10 17:23:48.868117099 +0000 UTC m=+3350.738352593" watchObservedRunningTime="2026-01-10 17:23:48.875894619 +0000 UTC m=+3350.746130113" Jan 10 17:23:51 crc kubenswrapper[5036]: I0110 17:23:51.998006 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:51 crc kubenswrapper[5036]: I0110 17:23:51.998278 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:52 crc kubenswrapper[5036]: I0110 17:23:52.042811 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:23:52 crc kubenswrapper[5036]: I0110 17:23:52.076791 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fhqxn" podStartSLOduration=6.482407019 podStartE2EDuration="11.07676407s" podCreationTimestamp="2026-01-10 17:23:41 +0000 UTC" firstStartedPulling="2026-01-10 17:23:43.78731493 +0000 UTC m=+3345.657550424" lastFinishedPulling="2026-01-10 17:23:48.381671981 +0000 UTC m=+3350.251907475" observedRunningTime="2026-01-10 17:23:48.907921616 +0000 UTC m=+3350.778157130" 
watchObservedRunningTime="2026-01-10 17:23:52.07676407 +0000 UTC m=+3353.946999564" Jan 10 17:23:52 crc kubenswrapper[5036]: I0110 17:23:52.185578 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:52 crc kubenswrapper[5036]: I0110 17:23:52.185623 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:23:53 crc kubenswrapper[5036]: I0110 17:23:53.269695 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fhqxn" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="registry-server" probeResult="failure" output=< Jan 10 17:23:53 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 17:23:53 crc kubenswrapper[5036]: > Jan 10 17:23:53 crc kubenswrapper[5036]: I0110 17:23:53.508872 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:23:53 crc kubenswrapper[5036]: E0110 17:23:53.509228 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:24:02 crc kubenswrapper[5036]: I0110 17:24:02.055237 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:24:02 crc kubenswrapper[5036]: I0110 17:24:02.123729 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:24:02 crc kubenswrapper[5036]: I0110 17:24:02.249802 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:24:02 crc kubenswrapper[5036]: I0110 17:24:02.326293 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:24:02 crc kubenswrapper[5036]: I0110 17:24:02.992523 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dnbcp" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="registry-server" containerID="cri-o://1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90" gracePeriod=2 Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.476348 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.509799 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxzv6\" (UniqueName: \"kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6\") pod \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.509971 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities\") pod \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.510065 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content\") pod \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\" (UID: \"9bfa0788-b3fd-4389-b6e6-2ae506c37306\") " Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.510629 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities" (OuterVolumeSpecName: "utilities") pod "9bfa0788-b3fd-4389-b6e6-2ae506c37306" (UID: "9bfa0788-b3fd-4389-b6e6-2ae506c37306"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.514987 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6" (OuterVolumeSpecName: "kube-api-access-vxzv6") pod "9bfa0788-b3fd-4389-b6e6-2ae506c37306" (UID: "9bfa0788-b3fd-4389-b6e6-2ae506c37306"). InnerVolumeSpecName "kube-api-access-vxzv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.540212 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bfa0788-b3fd-4389-b6e6-2ae506c37306" (UID: "9bfa0788-b3fd-4389-b6e6-2ae506c37306"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.614372 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.614405 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bfa0788-b3fd-4389-b6e6-2ae506c37306-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.614417 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxzv6\" (UniqueName: \"kubernetes.io/projected/9bfa0788-b3fd-4389-b6e6-2ae506c37306-kube-api-access-vxzv6\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:03 crc kubenswrapper[5036]: I0110 17:24:03.907823 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010338 5036 generic.go:334] "Generic (PLEG): container finished" podID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerID="1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90" exitCode=0 Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010398 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnbcp" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010430 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerDied","Data":"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90"} Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010471 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnbcp" event={"ID":"9bfa0788-b3fd-4389-b6e6-2ae506c37306","Type":"ContainerDied","Data":"4ad02e9e715a5fbde5d5d9c777b71125daf25180c36c3c739866a773ed34eaf1"} Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010487 5036 scope.go:117] "RemoveContainer" containerID="1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.010866 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fhqxn" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="registry-server" containerID="cri-o://7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5" gracePeriod=2 Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.032907 5036 scope.go:117] "RemoveContainer" containerID="4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.053227 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.065452 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnbcp"] Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.074158 5036 scope.go:117] "RemoveContainer" containerID="4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.241806 5036 scope.go:117] "RemoveContainer" 
containerID="1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90" Jan 10 17:24:04 crc kubenswrapper[5036]: E0110 17:24:04.243076 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90\": container with ID starting with 1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90 not found: ID does not exist" containerID="1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.243104 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90"} err="failed to get container status \"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90\": rpc error: code = NotFound desc = could not find container \"1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90\": container with ID starting with 1404b6dd3a7cbd2bc8c0e1f600af105cf6fe9d9246052a09665e1cb86a5ffc90 not found: ID does not exist" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.243123 5036 scope.go:117] "RemoveContainer" containerID="4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042" Jan 10 17:24:04 crc kubenswrapper[5036]: E0110 17:24:04.243462 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042\": container with ID starting with 4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042 not found: ID does not exist" containerID="4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.243528 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042"} err="failed to get container status \"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042\": rpc error: code = NotFound desc = could not find container \"4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042\": container with ID starting with 4ec36266f4ce9c554ccb17bbaa76b95d84ae9049f116b2e1f165a3b365f7d042 not found: ID does not exist" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.243546 5036 scope.go:117] "RemoveContainer" containerID="4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8" Jan 10 17:24:04 crc kubenswrapper[5036]: E0110 17:24:04.243955 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8\": container with ID starting with 4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8 not found: ID does not exist" containerID="4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.243981 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8"} err="failed to get container status \"4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8\": rpc error: code = NotFound desc = could not find container \"4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8\": container with ID starting with 
4273c1e325de2c866698c8064dd93a646d225a3fc034c2922ba76ebd03fce2c8 not found: ID does not exist" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.465131 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.517095 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" path="/var/lib/kubelet/pods/9bfa0788-b3fd-4389-b6e6-2ae506c37306/volumes" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.634471 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6ch\" (UniqueName: \"kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch\") pod \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.634566 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities\") pod \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.634660 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content\") pod \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\" (UID: \"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160\") " Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.637019 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities" (OuterVolumeSpecName: "utilities") pod "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" (UID: "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.639655 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch" (OuterVolumeSpecName: "kube-api-access-pd6ch") pod "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" (UID: "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160"). InnerVolumeSpecName "kube-api-access-pd6ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.736717 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6ch\" (UniqueName: \"kubernetes.io/projected/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-kube-api-access-pd6ch\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.736757 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.784475 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" (UID: "ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:24:04 crc kubenswrapper[5036]: I0110 17:24:04.839286 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.027438 5036 generic.go:334] "Generic (PLEG): container finished" podID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerID="7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5" exitCode=0 Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.027478 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerDied","Data":"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5"} Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.027510 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhqxn" event={"ID":"ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160","Type":"ContainerDied","Data":"37d9739685f68578f3e182cf2cd13cf637b9a0fd6f88a638ebcffc6c504f8673"} Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.027517 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhqxn" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.027533 5036 scope.go:117] "RemoveContainer" containerID="7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.067123 5036 scope.go:117] "RemoveContainer" containerID="1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.097085 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.104714 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fhqxn"] Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.105487 5036 scope.go:117] "RemoveContainer" containerID="0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.122103 5036 scope.go:117] "RemoveContainer" containerID="7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5" Jan 10 17:24:05 crc kubenswrapper[5036]: E0110 17:24:05.122526 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5\": container with ID starting with 7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5 not found: ID does not exist" containerID="7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.122571 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5"} err="failed to get container status \"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5\": rpc error: code = NotFound desc = could not find container \"7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5\": container with ID starting with 7d7601f46002837f61af164fd66598b1b27c935f0b9c5fe1f798695d99bbd0f5 not found: ID does not exist" Jan 10 17:24:05 crc 
kubenswrapper[5036]: I0110 17:24:05.122605 5036 scope.go:117] "RemoveContainer" containerID="1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41" Jan 10 17:24:05 crc kubenswrapper[5036]: E0110 17:24:05.124549 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41\": container with ID starting with 1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41 not found: ID does not exist" containerID="1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.124600 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41"} err="failed to get container status \"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41\": rpc error: code = NotFound desc = could not find container \"1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41\": container with ID starting with 1d37c7421ce1d5fd6489d2e5e5dc57c0068f54287bdf368f0166afb31f42fc41 not found: ID does not exist" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.124633 5036 scope.go:117] "RemoveContainer" containerID="0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049" Jan 10 17:24:05 crc kubenswrapper[5036]: E0110 17:24:05.125057 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049\": container with ID starting with 0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049 not found: ID does not exist" containerID="0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049" Jan 10 17:24:05 crc kubenswrapper[5036]: I0110 17:24:05.125095 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049"} err="failed to get container status \"0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049\": rpc error: code = NotFound desc = could not find container \"0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049\": container with ID starting with 0bf79d1f31694922f34f290ed08c2c2c0665fc92634122a3747fc062f6fa1049 not found: ID does not exist" Jan 10 17:24:06 crc kubenswrapper[5036]: I0110 17:24:06.539424 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" path="/var/lib/kubelet/pods/ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160/volumes" Jan 10 17:24:07 crc kubenswrapper[5036]: I0110 17:24:07.508479 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:24:07 crc kubenswrapper[5036]: E0110 17:24:07.508883 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:24:18 crc kubenswrapper[5036]: I0110 17:24:18.514468 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" 
Jan 10 17:24:18 crc kubenswrapper[5036]: E0110 17:24:18.516217 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.431822 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7655587964-dzfxf_a96677c4-c2f0-4fba-bcb0-a657dfdd1f41/barbican-api/0.log" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.578843 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7655587964-dzfxf_a96677c4-c2f0-4fba-bcb0-a657dfdd1f41/barbican-api-log/0.log" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.652277 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77cbb79454-h7btf_731670b8-d6af-49c5-b8cf-ddeafb2462c7/barbican-keystone-listener/0.log" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.737376 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77cbb79454-h7btf_731670b8-d6af-49c5-b8cf-ddeafb2462c7/barbican-keystone-listener-log/0.log" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.870339 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55c7665d4c-brkx9_608bfa08-ff8b-4f06-bc62-e456f9e2005c/barbican-worker/0.log" Jan 10 17:24:30 crc kubenswrapper[5036]: I0110 17:24:30.876305 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55c7665d4c-brkx9_608bfa08-ff8b-4f06-bc62-e456f9e2005c/barbican-worker-log/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.044859 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz_d9f0ccdb-1434-4bd0-90e1-d9314c8d716f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.110165 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/ceilometer-central-agent/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.183346 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/proxy-httpd/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.205021 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/ceilometer-notification-agent/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.299521 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/sg-core/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.388150 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr_35435ad9-1b59-46c6-b2c7-a57b43c65a3d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.488121 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77_993c9fcb-a10b-4d08-ae74-2bc52e9d8131/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.616898 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bfc371-7aba-4a4d-b018-4a79ad8a0b7b/cinder-api/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.701108 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bfc371-7aba-4a4d-b018-4a79ad8a0b7b/cinder-api-log/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.838029 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_72df80ca-b881-4bc6-b6bc-816dccb6a4a6/probe/0.log" Jan 10 17:24:31 crc kubenswrapper[5036]: I0110 17:24:31.997454 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_db9849cf-82c8-4f9d-86f2-c7bf664528c9/cinder-scheduler/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.015886 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_72df80ca-b881-4bc6-b6bc-816dccb6a4a6/cinder-backup/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.083135 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_db9849cf-82c8-4f9d-86f2-c7bf664528c9/probe/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.160955 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_5e51ea81-c177-4dc1-a427-c3290a9e6010/probe/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.279159 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_5e51ea81-c177-4dc1-a427-c3290a9e6010/cinder-volume/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.361686 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4_feaba290-606b-4396-af62-f32fd6e33a53/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.465434 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b4gng_bf597c03-b76a-445a-84d3-034d70ca102e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.508241 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:24:32 crc kubenswrapper[5036]: E0110 17:24:32.508565 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.574054 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/init/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.798396 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/dnsmasq-dns/0.log" Jan 10 
17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.801387 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/init/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.838352 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_31000160-d620-481e-8b44-98f23e2e0679/glance-httpd/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.957558 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_31000160-d620-481e-8b44-98f23e2e0679/glance-log/0.log" Jan 10 17:24:32 crc kubenswrapper[5036]: I0110 17:24:32.987803 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_46236d51-28af-48ad-8aff-2300b9d0155f/glance-httpd/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.021769 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_46236d51-28af-48ad-8aff-2300b9d0155f/glance-log/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.274952 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bcc8455c4-njd4j_e92a2ceb-4619-4207-a2a3-b6c588674ab8/horizon-log/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.275725 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bcc8455c4-njd4j_e92a2ceb-4619-4207-a2a3-b6c588674ab8/horizon/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.383427 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9_421d37b9-14cd-4270-b305-c6f946cd32a3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.483037 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zzhbf_7be91e0f-1820-445f-b106-0558e046ac4a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.734643 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f7c9c789b-dj95d_75115cba-8c6e-4c48-b71c-0277c43f446c/keystone-api/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.782650 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29467741-znhmh_6bcb0a70-9f58-48f3-b35d-3adf490692cb/keystone-cron/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.921779 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2c6502b1-879a-46ee-a2ff-54cece3ee9e6/kube-state-metrics/0.log" Jan 10 17:24:33 crc kubenswrapper[5036]: I0110 17:24:33.984507 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6_b0c29b9c-0e82-4bbc-89af-fa26d3c4603b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.013536 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-97e9-account-create-update-c6fxd_d460130e-a99b-46ab-b4d5-fa9528b70515/mariadb-account-create-update/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.167758 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_32551bcd-e5f3-445c-b4d2-d4ac138a54ce/manila-api-log/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 
17:24:34.228367 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_32551bcd-e5f3-445c-b4d2-d4ac138a54ce/manila-api/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.303544 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-zfb2d_01a3d231-ccaa-462b-a57b-b56b4e0f2921/mariadb-database-create/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.375620 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-lk5kh_e9d964e6-ac20-4cac-ad16-6461bc88fac7/manila-db-sync/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.504602 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da772573-b489-4f28-85da-5d242835ae61/manila-scheduler/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.728509 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da772573-b489-4f28-85da-5d242835ae61/probe/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.789238 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c/manila-share/0.log" Jan 10 17:24:34 crc kubenswrapper[5036]: I0110 17:24:34.815197 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c/probe/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.017894 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74d5fd97c9-96pjx_ffb9a3a8-bbeb-414f-8d26-f35e51a05957/neutron-api/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.029379 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74d5fd97c9-96pjx_ffb9a3a8-bbeb-414f-8d26-f35e51a05957/neutron-httpd/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.281605 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x_3f111a6e-f987-4636-aada-aee2793d5047/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.517164 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3cf23453-9366-4458-9e7c-af60e7ef7b83/nova-api-log/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.579687 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3cf23453-9366-4458-9e7c-af60e7ef7b83/nova-api-api/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.615008 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e98e9d9c-f90a-44da-9b67-2dadaf5b24b3/nova-cell0-conductor-conductor/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.859540 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1de05ac5-ff01-445f-b1a8-41a7db2a70c4/nova-cell1-conductor-conductor/0.log" Jan 10 17:24:35 crc kubenswrapper[5036]: I0110 17:24:35.953982 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8c0ed0eb-87d3-43cc-bdbb-1269890e7799/nova-cell1-novncproxy-novncproxy/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.123738 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl_b4da8068-8e5a-4624-b65f-05da63640d19/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.197298 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f2dade9a-7926-4c9b-82df-4c525efd69db/nova-metadata-log/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.483528 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_50d7fbd5-136f-4138-b4de-7d0841e80688/nova-scheduler-scheduler/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.555276 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/mysql-bootstrap/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.700480 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/mysql-bootstrap/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.716021 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/galera/0.log" Jan 10 17:24:36 crc kubenswrapper[5036]: I0110 17:24:36.898580 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/mysql-bootstrap/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.066060 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/mysql-bootstrap/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.080611 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/galera/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.272049 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f2dade9a-7926-4c9b-82df-4c525efd69db/nova-metadata-metadata/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.306271 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_70cfbefa-2928-4ca5-aa74-93fb1b4cd059/openstackclient/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.330984 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-czqbw_be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4/ovn-controller/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.525995 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jp5mj_d0f482ce-10a1-42c2-80f6-60fd28c8cc25/openstack-network-exporter/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.548895 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server-init/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.700434 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server-init/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.725483 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.744708 5036 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovs-vswitchd/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.933123 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-skrfj_7cb46990-94ee-4a82-93a2-a30c563f1146/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:37 crc kubenswrapper[5036]: I0110 17:24:37.944652 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d1aa719-1166-4afe-8263-c771aa0a25da/openstack-network-exporter/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.021652 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d1aa719-1166-4afe-8263-c771aa0a25da/ovn-northd/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.162242 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b642befa-dd18-4984-b74f-d3945ee06f7d/openstack-network-exporter/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.202424 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b642befa-dd18-4984-b74f-d3945ee06f7d/ovsdbserver-nb/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.341558 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f74eaf1-cd39-41dc-8c0a-170373e863e5/openstack-network-exporter/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.345394 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f74eaf1-cd39-41dc-8c0a-170373e863e5/ovsdbserver-sb/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.535506 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffbbc4bd-swcjc_5b379ab6-fc59-475f-909f-4f71e7184803/placement-api/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.556803 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffbbc4bd-swcjc_5b379ab6-fc59-475f-909f-4f71e7184803/placement-log/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.683126 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/setup-container/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.842202 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/rabbitmq/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.859395 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/setup-container/0.log" Jan 10 17:24:38 crc kubenswrapper[5036]: I0110 17:24:38.892269 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/setup-container/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.121388 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/rabbitmq/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.133998 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/setup-container/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.173506 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f_f551e3a3-cdf6-4fc6-8452-869afe1cef86/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.385509 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx_8e48f105-5183-4dd8-94d9-8a8636ca4c82/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.420281 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mw76h_956e3be3-ef01-423c-a80d-1b6c517aee91/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.594026 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jw6dr_35cc2e15-b6d3-419a-b719-1fcee66ce1b5/ssh-known-hosts-edpm-deployment/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.678807 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d98e00b3-6224-462a-abe0-52e09ac44fb8/tempest-tests-tempest-tests-runner/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.764625 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_bbb64716-afc7-4c1a-be2f-2ff9cc886e96/test-operator-logs-container/0.log" Jan 10 17:24:39 crc kubenswrapper[5036]: I0110 17:24:39.907844 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg_be9c4cc3-5744-42de-809a-fcd16a407199/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:24:43 crc kubenswrapper[5036]: I0110 17:24:43.507788 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:24:43 crc kubenswrapper[5036]: E0110 17:24:43.508432 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.733206 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734664 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="extract-content" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734691 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="extract-content" Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734700 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="extract-utilities" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734706 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="extract-utilities" Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734733 5036 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="extract-content" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734739 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="extract-content" Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734748 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="extract-utilities" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734754 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="extract-utilities" Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734782 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734788 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: E0110 17:24:54.734820 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.734826 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.735156 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfa0788-b3fd-4389-b6e6-2ae506c37306" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.735183 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac19c7ae-6cb8-4fd2-92e3-05cabf5e9160" containerName="registry-server" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.761796 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.772399 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.866238 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.866564 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.866671 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2cfn\" (UniqueName: \"kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.967964 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.968078 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.968209 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2cfn\" (UniqueName: \"kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.968994 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.969241 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:54 crc kubenswrapper[5036]: I0110 17:24:54.986865 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g2cfn\" (UniqueName: \"kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn\") pod \"community-operators-r68tv\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:55 crc kubenswrapper[5036]: I0110 17:24:55.096712 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:24:55 crc kubenswrapper[5036]: I0110 17:24:55.607256 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:24:56 crc kubenswrapper[5036]: I0110 17:24:56.511052 5036 generic.go:334] "Generic (PLEG): container finished" podID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerID="a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e" exitCode=0 Jan 10 17:24:56 crc kubenswrapper[5036]: I0110 17:24:56.528608 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerDied","Data":"a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e"} Jan 10 17:24:56 crc kubenswrapper[5036]: I0110 17:24:56.528648 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerStarted","Data":"587052c32f6ab8df4f588853b04512c9fc679bd38d4edcea862405d6c89a562c"} Jan 10 17:24:57 crc kubenswrapper[5036]: I0110 17:24:57.508389 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:24:57 crc kubenswrapper[5036]: I0110 17:24:57.543949 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerStarted","Data":"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86"} Jan 10 17:24:58 crc kubenswrapper[5036]: I0110 17:24:58.559869 5036 generic.go:334] "Generic (PLEG): container finished" podID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerID="b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86" exitCode=0 Jan 10 17:24:58 crc kubenswrapper[5036]: I0110 17:24:58.560500 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerDied","Data":"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86"} Jan 10 17:24:58 crc kubenswrapper[5036]: I0110 17:24:58.567232 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134"} Jan 10 17:24:59 crc kubenswrapper[5036]: I0110 17:24:59.576380 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerStarted","Data":"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82"} Jan 10 17:24:59 crc kubenswrapper[5036]: I0110 17:24:59.606869 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r68tv" podStartSLOduration=3.061034502 
podStartE2EDuration="5.606842469s" podCreationTimestamp="2026-01-10 17:24:54 +0000 UTC" firstStartedPulling="2026-01-10 17:24:56.514323347 +0000 UTC m=+3418.384558841" lastFinishedPulling="2026-01-10 17:24:59.060131314 +0000 UTC m=+3420.930366808" observedRunningTime="2026-01-10 17:24:59.600067277 +0000 UTC m=+3421.470302771" watchObservedRunningTime="2026-01-10 17:24:59.606842469 +0000 UTC m=+3421.477077983" Jan 10 17:24:59 crc kubenswrapper[5036]: I0110 17:24:59.696809 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_066ba36b-3da0-4db3-8f19-13e5a5227ab5/memcached/0.log" Jan 10 17:25:05 crc kubenswrapper[5036]: I0110 17:25:05.097871 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:05 crc kubenswrapper[5036]: I0110 17:25:05.098595 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:05 crc kubenswrapper[5036]: I0110 17:25:05.160071 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:05 crc kubenswrapper[5036]: I0110 17:25:05.689921 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:05 crc kubenswrapper[5036]: I0110 17:25:05.743494 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:25:06 crc kubenswrapper[5036]: I0110 17:25:06.793883 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:25:06 crc kubenswrapper[5036]: I0110 17:25:06.793997 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:25:06 crc kubenswrapper[5036]: I0110 17:25:06.794730 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:25:06 crc kubenswrapper[5036]: I0110 17:25:06.885374 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.035032 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.040668 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/extract/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.057196 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.233330 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-678b8c6d96-568pc_a17f3d4e-41a9-4941-83f6-090808b6cb29/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.303902 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78979fc445-2qq47_f1b7f315-826c-4a66-9919-69b3c75a648e/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.426346 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-trzdf_52b19fea-05ac-4448-9446-33fbee11b2da/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.563998 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5967c8645c-cbdjv_09239a1e-ce39-49e7-a532-f7c353022176/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.652943 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r68tv" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="registry-server" containerID="cri-o://1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82" gracePeriod=2 Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.673757 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-65c54c675d-ng9ld_ecf84720-507a-4a26-8326-7ed56754871e/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.737398 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7998b4cc7b-bjxnm_58ba757d-493c-4a4c-9aaa-a3178272b7cb/manager/0.log" Jan 10 17:25:07 crc kubenswrapper[5036]: I0110 17:25:07.923321 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5b47c74dd5-skh8x_b4905be6-774a-4952-b195-f755688c7b26/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.154347 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.192244 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xcmds_80ddf12b-ee61-4d6f-a3fb-ff9aded793d7/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.208873 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities\") pod \"b0146b60-996f-4b5c-a6ad-19d987cee70d\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.208973 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2cfn\" (UniqueName: \"kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn\") pod \"b0146b60-996f-4b5c-a6ad-19d987cee70d\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.209045 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content\") pod \"b0146b60-996f-4b5c-a6ad-19d987cee70d\" (UID: \"b0146b60-996f-4b5c-a6ad-19d987cee70d\") " Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.210393 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities" (OuterVolumeSpecName: "utilities") pod "b0146b60-996f-4b5c-a6ad-19d987cee70d" (UID: "b0146b60-996f-4b5c-a6ad-19d987cee70d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.239841 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn" (OuterVolumeSpecName: "kube-api-access-g2cfn") pod "b0146b60-996f-4b5c-a6ad-19d987cee70d" (UID: "b0146b60-996f-4b5c-a6ad-19d987cee70d"). InnerVolumeSpecName "kube-api-access-g2cfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.255666 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-568985c78-cs2b2_611b3f4f-0b6d-4ef9-b040-eba991c4bfe4/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.283777 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0146b60-996f-4b5c-a6ad-19d987cee70d" (UID: "b0146b60-996f-4b5c-a6ad-19d987cee70d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.311119 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.311161 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2cfn\" (UniqueName: \"kubernetes.io/projected/b0146b60-996f-4b5c-a6ad-19d987cee70d-kube-api-access-g2cfn\") on node \"crc\" DevicePath \"\"" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.311173 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0146b60-996f-4b5c-a6ad-19d987cee70d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.375461 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-598945d5b8-l8ggv_552c1d94-e289-46e0-8756-58982a7cdc4c/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.418760 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-746ccdd857-kkjhp_0a3b9993-b2fb-4dda-952a-413cd5a3e01a/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.662371 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-4t295_6414be0b-ef34-4c95-9e31-4124dcad6cc4/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.662966 5036 generic.go:334] "Generic (PLEG): container finished" podID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerID="1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82" exitCode=0 Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.663067 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerDied","Data":"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82"} Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.663096 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r68tv" event={"ID":"b0146b60-996f-4b5c-a6ad-19d987cee70d","Type":"ContainerDied","Data":"587052c32f6ab8df4f588853b04512c9fc679bd38d4edcea862405d6c89a562c"} Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.663112 5036 scope.go:117] "RemoveContainer" containerID="1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.663301 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r68tv" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.684038 5036 scope.go:117] "RemoveContainer" containerID="b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.697440 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-t7qtf_506aa4ca-31bb-48da-94b5-9ab7b43aea96/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.702739 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.703003 5036 scope.go:117] "RemoveContainer" containerID="a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.709619 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r68tv"] Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.750006 5036 scope.go:117] "RemoveContainer" containerID="1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82" Jan 10 17:25:08 crc kubenswrapper[5036]: E0110 17:25:08.751007 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82\": container with ID starting with 1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82 not found: ID does not exist" containerID="1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.751048 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82"} err="failed to get container status \"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82\": rpc error: code = NotFound desc = could not find container \"1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82\": container with ID starting with 1c2703e2c47da633993755f2b0ae5ff58574d6b2e5916747baf8c51212a0cb82 not found: ID does not exist" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.751072 5036 scope.go:117] "RemoveContainer" containerID="b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86" Jan 10 17:25:08 crc kubenswrapper[5036]: E0110 17:25:08.751307 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86\": container with ID starting with b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86 not found: ID does not exist" containerID="b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.751326 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86"} err="failed to get container status \"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86\": rpc error: code = NotFound desc = could not find container \"b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86\": container with ID starting with b91df23abbe8e56dc5098965bd5e80c87250060b82dcaf421030c752e198cf86 not found: ID does not exist" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 
17:25:08.751338 5036 scope.go:117] "RemoveContainer" containerID="a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e" Jan 10 17:25:08 crc kubenswrapper[5036]: E0110 17:25:08.751491 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e\": container with ID starting with a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e not found: ID does not exist" containerID="a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.751508 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e"} err="failed to get container status \"a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e\": rpc error: code = NotFound desc = could not find container \"a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e\": container with ID starting with a45d28669952dd9f4c464d5ad8c1b0650f61aa99c3dc1b1a7ea1dc13e228620e not found: ID does not exist" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.850113 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-wv445_254c9f2b-ef77-4fbc-9884-c14caa297876/manager/0.log" Jan 10 17:25:08 crc kubenswrapper[5036]: I0110 17:25:08.869314 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b4889549f2j7sh_f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1/manager/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.294698 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sdc2j_3929902d-323d-44ec-84be-4069e262618f/registry-server/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.307996 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d4cd6578d-pt5gl_0ab4dccd-a4ff-49f6-96bf-a7150425ff15/operator/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.570733 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-84587ffc8-l7b7s_6283e4f6-c60e-4bff-b622-181c4abbc8a6/manager/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.600434 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-zz7v2_cf6aa765-9fbf-429d-83c1-db4671e7600c/manager/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.831952 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gcjhz_4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd/operator/0.log" Jan 10 17:25:09 crc kubenswrapper[5036]: I0110 17:25:09.988917 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bb586bbf4-tn7cg_2e9ebb80-028a-43ac-b9cb-379dd1eda24e/manager/0.log" Jan 10 17:25:10 crc kubenswrapper[5036]: I0110 17:25:10.173292 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68d988df55-88zlb_2c21d679-225e-4c33-8920-06a85ae163b6/manager/0.log" Jan 10 17:25:10 crc kubenswrapper[5036]: I0110 17:25:10.180234 5036 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56458c9ddd-p4bsn_de8e8f66-6d85-43d5-94a4-613fb3bfc53b/manager/0.log" Jan 10 17:25:10 crc kubenswrapper[5036]: I0110 17:25:10.372146 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6c866cfdcb-nbdkb_f3046ad8-aadd-4883-82b9-a794ddce82b9/manager/0.log" Jan 10 17:25:10 crc kubenswrapper[5036]: I0110 17:25:10.416547 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-9dbdf6486-fwmft_7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42/manager/0.log" Jan 10 17:25:10 crc kubenswrapper[5036]: I0110 17:25:10.522003 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" path="/var/lib/kubelet/pods/b0146b60-996f-4b5c-a6ad-19d987cee70d/volumes" Jan 10 17:25:30 crc kubenswrapper[5036]: I0110 17:25:30.064303 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ptztt_acb54813-4c4d-4b94-9337-19541ac1980e/control-plane-machine-set-operator/0.log" Jan 10 17:25:30 crc kubenswrapper[5036]: I0110 17:25:30.171148 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-45j5v_6b14e5d5-1b40-45f6-a5c6-c161eeade0f9/kube-rbac-proxy/0.log" Jan 10 17:25:30 crc kubenswrapper[5036]: I0110 17:25:30.230247 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-45j5v_6b14e5d5-1b40-45f6-a5c6-c161eeade0f9/machine-api-operator/0.log" Jan 10 17:25:44 crc kubenswrapper[5036]: I0110 17:25:44.887578 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-hrdzf_ef12a866-7983-4859-8d00-6ba6ed292af3/cert-manager-controller/0.log" Jan 10 17:25:45 crc kubenswrapper[5036]: I0110 17:25:45.024706 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-b7rxm_2574f8f4-e56e-4d7e-b181-5e01d69b1485/cert-manager-cainjector/0.log" Jan 10 17:25:45 crc kubenswrapper[5036]: I0110 17:25:45.084719 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-pgpxj_b15af209-c459-40f3-affc-0d5a3d2b031d/cert-manager-webhook/0.log" Jan 10 17:25:58 crc kubenswrapper[5036]: I0110 17:25:58.502960 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-577fv_31225a12-4366-4224-9bb5-3c8ee635a631/nmstate-console-plugin/0.log" Jan 10 17:25:58 crc kubenswrapper[5036]: I0110 17:25:58.661221 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7r9qs_5abcf259-63b1-44b5-b335-950b101edec4/nmstate-handler/0.log" Jan 10 17:25:58 crc kubenswrapper[5036]: I0110 17:25:58.714292 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-f826c_d921a9df-835d-4165-ac39-8717cfcf384d/kube-rbac-proxy/0.log" Jan 10 17:25:58 crc kubenswrapper[5036]: I0110 17:25:58.719005 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-f826c_d921a9df-835d-4165-ac39-8717cfcf384d/nmstate-metrics/0.log" Jan 10 17:25:59 crc kubenswrapper[5036]: I0110 17:25:59.066756 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-bnf7g_d8791ab9-ee3b-4af7-98d5-2bc06f5d863a/nmstate-operator/0.log" Jan 10 17:25:59 crc kubenswrapper[5036]: I0110 17:25:59.086745 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-cdrlk_e0f63dbf-f65f-4d9a-8cf4-802a41ed012b/nmstate-webhook/0.log" Jan 10 17:26:09 crc kubenswrapper[5036]: I0110 17:26:09.051751 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-zfb2d"] Jan 10 17:26:09 crc kubenswrapper[5036]: I0110 17:26:09.060301 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-97e9-account-create-update-c6fxd"] Jan 10 17:26:09 crc kubenswrapper[5036]: I0110 17:26:09.075600 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-97e9-account-create-update-c6fxd"] Jan 10 17:26:09 crc kubenswrapper[5036]: I0110 17:26:09.084743 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-zfb2d"] Jan 10 17:26:10 crc kubenswrapper[5036]: I0110 17:26:10.522606 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a3d231-ccaa-462b-a57b-b56b4e0f2921" path="/var/lib/kubelet/pods/01a3d231-ccaa-462b-a57b-b56b4e0f2921/volumes" Jan 10 17:26:10 crc kubenswrapper[5036]: I0110 17:26:10.523664 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d460130e-a99b-46ab-b4d5-fa9528b70515" path="/var/lib/kubelet/pods/d460130e-a99b-46ab-b4d5-fa9528b70515/volumes" Jan 10 17:26:24 crc kubenswrapper[5036]: I0110 17:26:24.895549 5036 scope.go:117] "RemoveContainer" containerID="5fd8730449515662f761080759c647c5bf24d32a8338fc12b78ae59e23c1234e" Jan 10 17:26:24 crc kubenswrapper[5036]: I0110 17:26:24.939532 5036 scope.go:117] "RemoveContainer" containerID="dfda14caa23bfac8209039b597c8208d37117f50a8a51807d99ab3470254e9b6" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.317313 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5xsxl_fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69/kube-rbac-proxy/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.372056 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5xsxl_fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69/controller/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.515000 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.682810 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.779225 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.796661 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:26:28 crc kubenswrapper[5036]: I0110 17:26:28.832399 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.007488 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.021583 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.057039 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.057154 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.227660 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.234246 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.282867 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/controller/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.303663 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.457117 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/kube-rbac-proxy-frr/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.465139 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/frr-metrics/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.510231 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/kube-rbac-proxy/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.638317 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/reloader/0.log" Jan 10 17:26:29 crc kubenswrapper[5036]: I0110 17:26:29.730119 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-qzgnv_24cab86b-603a-48b4-9b8f-add5e9a79f7b/frr-k8s-webhook-server/0.log" Jan 10 17:26:30 crc kubenswrapper[5036]: I0110 17:26:30.539866 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86bf866985-6ggxt_12be6fd4-c97c-439e-8a06-3769f37d7b48/webhook-server/0.log" Jan 10 17:26:30 crc kubenswrapper[5036]: I0110 17:26:30.589364 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-57fdf6dfbb-rvjhl_27d72d19-58d6-4094-8d3a-826354e6bb02/manager/0.log" Jan 10 17:26:30 crc kubenswrapper[5036]: I0110 17:26:30.842003 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxfjm_6d60af08-1ea0-49e4-aa55-8f9bfa63b34b/kube-rbac-proxy/0.log" Jan 10 17:26:31 crc kubenswrapper[5036]: I0110 17:26:31.294477 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-bxfjm_6d60af08-1ea0-49e4-aa55-8f9bfa63b34b/speaker/0.log" Jan 10 17:26:31 crc kubenswrapper[5036]: I0110 17:26:31.598777 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/frr/0.log" Jan 10 17:26:39 crc kubenswrapper[5036]: I0110 17:26:39.070719 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lk5kh"] Jan 10 17:26:39 crc kubenswrapper[5036]: I0110 17:26:39.088858 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lk5kh"] Jan 10 17:26:40 crc kubenswrapper[5036]: I0110 17:26:40.520514 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d964e6-ac20-4cac-ad16-6461bc88fac7" path="/var/lib/kubelet/pods/e9d964e6-ac20-4cac-ad16-6461bc88fac7/volumes" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.378170 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.571751 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.578358 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.607257 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.725655 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.758085 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.782480 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/extract/0.log" Jan 10 17:26:45 crc kubenswrapper[5036]: I0110 17:26:45.927335 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.087928 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.088798 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.125377 
5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.283137 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.323540 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/extract/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.345406 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.467713 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.646077 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.662790 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.697417 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.791316 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:26:46 crc kubenswrapper[5036]: I0110 17:26:46.815057 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.011156 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.172849 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/registry-server/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.225115 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.276587 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.281270 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.423133 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.438516 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.589372 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gm65z_d8de44e3-ed07-4c76-8aa8-2265c9cd1805/marketplace-operator/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.712994 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.907723 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.930472 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.964272 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/registry-server/0.log" Jan 10 17:26:47 crc kubenswrapper[5036]: I0110 17:26:47.978343 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.130031 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.131760 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.227710 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/registry-server/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.336400 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.497415 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.513766 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.530866 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.690095 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:26:48 crc kubenswrapper[5036]: I0110 17:26:48.779116 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:26:49 crc kubenswrapper[5036]: I0110 17:26:49.188334 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/registry-server/0.log" Jan 10 17:27:10 crc kubenswrapper[5036]: E0110 17:27:10.587096 5036 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.83:48782->38.102.83.83:37657: read tcp 38.102.83.83:48782->38.102.83.83:37657: read: connection reset by peer Jan 10 17:27:10 crc kubenswrapper[5036]: E0110 17:27:10.587665 5036 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.83:48782->38.102.83.83:37657: write tcp 38.102.83.83:48782->38.102.83.83:37657: write: broken pipe Jan 10 17:27:25 crc kubenswrapper[5036]: I0110 17:27:25.053970 5036 scope.go:117] "RemoveContainer" containerID="acec2eef33e20f5517bc0a09686c6e1d11091dc40b735e07d9bfa39c8e63a6d9" Jan 10 17:27:25 crc kubenswrapper[5036]: I0110 17:27:25.907391 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:27:25 crc kubenswrapper[5036]: I0110 17:27:25.907807 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:27:55 crc kubenswrapper[5036]: I0110 17:27:55.904296 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:27:55 crc kubenswrapper[5036]: I0110 17:27:55.905196 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:28:25 crc kubenswrapper[5036]: I0110 17:28:25.903854 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:28:25 crc kubenswrapper[5036]: I0110 17:28:25.904320 5036 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:28:25 crc kubenswrapper[5036]: I0110 17:28:25.904360 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:28:25 crc kubenswrapper[5036]: I0110 17:28:25.905015 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:28:25 crc kubenswrapper[5036]: I0110 17:28:25.905057 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134" gracePeriod=600 Jan 10 17:28:27 crc kubenswrapper[5036]: I0110 17:28:27.024158 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134" exitCode=0 Jan 10 17:28:27 crc kubenswrapper[5036]: I0110 17:28:27.024266 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134"} Jan 10 17:28:27 crc kubenswrapper[5036]: I0110 17:28:27.024782 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e"} Jan 10 17:28:27 crc kubenswrapper[5036]: I0110 17:28:27.024813 5036 scope.go:117] "RemoveContainer" containerID="7e60cfdd4120f63892e58d95243b59d8c26446827e0746f1dcef638d2b4e9dc4" Jan 10 17:28:33 crc kubenswrapper[5036]: I0110 17:28:33.110821 5036 generic.go:334] "Generic (PLEG): container finished" podID="e020151a-987f-4113-8b01-ea7b8c426757" containerID="6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee" exitCode=0 Jan 10 17:28:33 crc kubenswrapper[5036]: I0110 17:28:33.110915 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdbxc/must-gather-skwkm" event={"ID":"e020151a-987f-4113-8b01-ea7b8c426757","Type":"ContainerDied","Data":"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee"} Jan 10 17:28:33 crc kubenswrapper[5036]: I0110 17:28:33.112280 5036 scope.go:117] "RemoveContainer" containerID="6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee" Jan 10 17:28:33 crc kubenswrapper[5036]: I0110 17:28:33.304583 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdbxc_must-gather-skwkm_e020151a-987f-4113-8b01-ea7b8c426757/gather/0.log" Jan 10 17:28:40 crc kubenswrapper[5036]: I0110 17:28:40.526562 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-kdbxc/must-gather-skwkm"] Jan 10 17:28:40 crc kubenswrapper[5036]: I0110 17:28:40.527601 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kdbxc/must-gather-skwkm" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="copy" containerID="cri-o://7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608" gracePeriod=2 Jan 10 17:28:40 crc kubenswrapper[5036]: I0110 17:28:40.545596 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdbxc/must-gather-skwkm"] Jan 10 17:28:40 crc kubenswrapper[5036]: I0110 17:28:40.940549 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdbxc_must-gather-skwkm_e020151a-987f-4113-8b01-ea7b8c426757/copy/0.log" Jan 10 17:28:40 crc kubenswrapper[5036]: I0110 17:28:40.941296 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.090698 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxbd7\" (UniqueName: \"kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7\") pod \"e020151a-987f-4113-8b01-ea7b8c426757\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.090745 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output\") pod \"e020151a-987f-4113-8b01-ea7b8c426757\" (UID: \"e020151a-987f-4113-8b01-ea7b8c426757\") " Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.096596 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7" (OuterVolumeSpecName: "kube-api-access-hxbd7") pod "e020151a-987f-4113-8b01-ea7b8c426757" (UID: "e020151a-987f-4113-8b01-ea7b8c426757"). InnerVolumeSpecName "kube-api-access-hxbd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.193213 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxbd7\" (UniqueName: \"kubernetes.io/projected/e020151a-987f-4113-8b01-ea7b8c426757-kube-api-access-hxbd7\") on node \"crc\" DevicePath \"\"" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.195369 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdbxc_must-gather-skwkm_e020151a-987f-4113-8b01-ea7b8c426757/copy/0.log" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.195803 5036 generic.go:334] "Generic (PLEG): container finished" podID="e020151a-987f-4113-8b01-ea7b8c426757" containerID="7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608" exitCode=143 Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.195940 5036 scope.go:117] "RemoveContainer" containerID="7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.196180 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kdbxc/must-gather-skwkm" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.214007 5036 scope.go:117] "RemoveContainer" containerID="6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.242579 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e020151a-987f-4113-8b01-ea7b8c426757" (UID: "e020151a-987f-4113-8b01-ea7b8c426757"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.274596 5036 scope.go:117] "RemoveContainer" containerID="7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608" Jan 10 17:28:41 crc kubenswrapper[5036]: E0110 17:28:41.275055 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608\": container with ID starting with 7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608 not found: ID does not exist" containerID="7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.275118 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608"} err="failed to get container status \"7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608\": rpc error: code = NotFound desc = could not find container \"7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608\": container with ID starting with 7d82dd1760bdf87a728147e0dc6f87fe82dbd93a6d2afba36f68be8def68f608 not found: ID does not exist" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.276006 5036 scope.go:117] "RemoveContainer" containerID="6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee" Jan 10 17:28:41 crc kubenswrapper[5036]: E0110 17:28:41.276577 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee\": container with ID starting with 6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee not found: ID does not exist" containerID="6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.276611 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee"} err="failed to get container status \"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee\": rpc error: code = NotFound desc = could not find container \"6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee\": container with ID starting with 6747abe413de75eb2a719869d77425fae684a08abebc747aac7b123f41f1a2ee not found: ID does not exist" Jan 10 17:28:41 crc kubenswrapper[5036]: I0110 17:28:41.295235 5036 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e020151a-987f-4113-8b01-ea7b8c426757-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 10 17:28:42 crc kubenswrapper[5036]: I0110 17:28:42.524000 5036 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e020151a-987f-4113-8b01-ea7b8c426757" path="/var/lib/kubelet/pods/e020151a-987f-4113-8b01-ea7b8c426757/volumes" Jan 10 17:29:25 crc kubenswrapper[5036]: I0110 17:29:25.145724 5036 scope.go:117] "RemoveContainer" containerID="bffe98affa727d66ad1f45903a677bb512bc7e30dc73302bfd19af92497e31bb" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.205750 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx"] Jan 10 17:30:00 crc kubenswrapper[5036]: E0110 17:30:00.206979 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="copy" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207002 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="copy" Jan 10 17:30:00 crc kubenswrapper[5036]: E0110 17:30:00.207066 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="registry-server" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207081 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="registry-server" Jan 10 17:30:00 crc kubenswrapper[5036]: E0110 17:30:00.207133 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="gather" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207147 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="gather" Jan 10 17:30:00 crc kubenswrapper[5036]: E0110 17:30:00.207165 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="extract-utilities" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207178 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="extract-utilities" Jan 10 17:30:00 crc kubenswrapper[5036]: E0110 17:30:00.207194 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="extract-content" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207207 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="extract-content" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207523 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="gather" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207563 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0146b60-996f-4b5c-a6ad-19d987cee70d" containerName="registry-server" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.207589 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="e020151a-987f-4113-8b01-ea7b8c426757" containerName="copy" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.208609 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.214531 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx"] Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.243384 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.243653 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.363543 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.363861 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.364546 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snb4r\" (UniqueName: \"kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.466578 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.466631 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.466709 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snb4r\" (UniqueName: \"kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.467740 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume\") pod 
\"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.476940 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.483436 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snb4r\" (UniqueName: \"kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r\") pod \"collect-profiles-29467770-2hplx\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:00 crc kubenswrapper[5036]: I0110 17:30:00.567848 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:01 crc kubenswrapper[5036]: W0110 17:30:01.044082 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f989e6_7b3a_4a45_ae07_e32c73679277.slice/crio-3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53 WatchSource:0}: Error finding container 3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53: Status 404 returned error can't find the container with id 3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53 Jan 10 17:30:01 crc kubenswrapper[5036]: I0110 17:30:01.044182 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx"] Jan 10 17:30:01 crc kubenswrapper[5036]: I0110 17:30:01.138002 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" event={"ID":"03f989e6-7b3a-4a45-ae07-e32c73679277","Type":"ContainerStarted","Data":"3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53"} Jan 10 17:30:02 crc kubenswrapper[5036]: I0110 17:30:02.159433 5036 generic.go:334] "Generic (PLEG): container finished" podID="03f989e6-7b3a-4a45-ae07-e32c73679277" containerID="8ba9016eade3075c702b6ec24e32d3dc69d636961ab93b81c5b3279c8f042939" exitCode=0 Jan 10 17:30:02 crc kubenswrapper[5036]: I0110 17:30:02.159507 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" event={"ID":"03f989e6-7b3a-4a45-ae07-e32c73679277","Type":"ContainerDied","Data":"8ba9016eade3075c702b6ec24e32d3dc69d636961ab93b81c5b3279c8f042939"} Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.537556 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.636368 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume\") pod \"03f989e6-7b3a-4a45-ae07-e32c73679277\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.636454 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snb4r\" (UniqueName: \"kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r\") pod \"03f989e6-7b3a-4a45-ae07-e32c73679277\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.636575 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume\") pod \"03f989e6-7b3a-4a45-ae07-e32c73679277\" (UID: \"03f989e6-7b3a-4a45-ae07-e32c73679277\") " Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.637238 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume" (OuterVolumeSpecName: "config-volume") pod "03f989e6-7b3a-4a45-ae07-e32c73679277" (UID: "03f989e6-7b3a-4a45-ae07-e32c73679277"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.643938 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03f989e6-7b3a-4a45-ae07-e32c73679277" (UID: "03f989e6-7b3a-4a45-ae07-e32c73679277"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.644026 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r" (OuterVolumeSpecName: "kube-api-access-snb4r") pod "03f989e6-7b3a-4a45-ae07-e32c73679277" (UID: "03f989e6-7b3a-4a45-ae07-e32c73679277"). InnerVolumeSpecName "kube-api-access-snb4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.739200 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snb4r\" (UniqueName: \"kubernetes.io/projected/03f989e6-7b3a-4a45-ae07-e32c73679277-kube-api-access-snb4r\") on node \"crc\" DevicePath \"\"" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.739252 5036 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03f989e6-7b3a-4a45-ae07-e32c73679277-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:30:03 crc kubenswrapper[5036]: I0110 17:30:03.739268 5036 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03f989e6-7b3a-4a45-ae07-e32c73679277-config-volume\") on node \"crc\" DevicePath \"\"" Jan 10 17:30:04 crc kubenswrapper[5036]: I0110 17:30:04.180592 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" event={"ID":"03f989e6-7b3a-4a45-ae07-e32c73679277","Type":"ContainerDied","Data":"3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53"} Jan 10 17:30:04 crc kubenswrapper[5036]: I0110 17:30:04.180648 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3891a290d3e7dad7495f91b0dd5703699a0b7e949bd13aeba32fc710dc58ca53" Jan 10 17:30:04 crc kubenswrapper[5036]: I0110 17:30:04.181165 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29467770-2hplx" Jan 10 17:30:04 crc kubenswrapper[5036]: I0110 17:30:04.656244 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2"] Jan 10 17:30:04 crc kubenswrapper[5036]: I0110 17:30:04.667123 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29467725-5xzc2"] Jan 10 17:30:06 crc kubenswrapper[5036]: I0110 17:30:06.527751 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1b58ad-b491-4354-a0b0-3ab868370dc9" path="/var/lib/kubelet/pods/7d1b58ad-b491-4354-a0b0-3ab868370dc9/volumes" Jan 10 17:30:25 crc kubenswrapper[5036]: I0110 17:30:25.275215 5036 scope.go:117] "RemoveContainer" containerID="b16a0b2c30b5bc3790ac01f824c82c020f74640e6ba51070b62457f494fd2647" Jan 10 17:30:55 crc kubenswrapper[5036]: I0110 17:30:55.904578 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:30:55 crc kubenswrapper[5036]: I0110 17:30:55.905207 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:31:25 crc kubenswrapper[5036]: I0110 17:31:25.903807 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Jan 10 17:31:25 crc kubenswrapper[5036]: I0110 17:31:25.904415 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.164086 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xk8v/must-gather-x4bc8"] Jan 10 17:31:37 crc kubenswrapper[5036]: E0110 17:31:37.165205 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f989e6-7b3a-4a45-ae07-e32c73679277" containerName="collect-profiles" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.165222 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f989e6-7b3a-4a45-ae07-e32c73679277" containerName="collect-profiles" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.165466 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f989e6-7b3a-4a45-ae07-e32c73679277" containerName="collect-profiles" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.166638 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.174428 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9xk8v/must-gather-x4bc8"] Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.233658 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9xk8v"/"openshift-service-ca.crt" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.233933 5036 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9xk8v"/"kube-root-ca.crt" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.234981 5036 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9xk8v"/"default-dockercfg-jdj57" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.259270 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.259334 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9qtd\" (UniqueName: \"kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.360873 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.360933 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9qtd\" (UniqueName: 
\"kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.361409 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.379752 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9qtd\" (UniqueName: \"kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd\") pod \"must-gather-x4bc8\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:37 crc kubenswrapper[5036]: I0110 17:31:37.557549 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:31:38 crc kubenswrapper[5036]: I0110 17:31:38.001799 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9xk8v/must-gather-x4bc8"] Jan 10 17:31:38 crc kubenswrapper[5036]: I0110 17:31:38.259390 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" event={"ID":"689c3546-b16e-4265-9e6e-57ce3915c006","Type":"ContainerStarted","Data":"345baad2d2086a573a5d39051b57f2df3d30a0cb100f4172ebca1b07359fe7e1"} Jan 10 17:31:39 crc kubenswrapper[5036]: I0110 17:31:39.270010 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" event={"ID":"689c3546-b16e-4265-9e6e-57ce3915c006","Type":"ContainerStarted","Data":"ed6c84dd9322b966a9bcbeac915c61b4480c44e10f9a68a1c9abda1a5ee0740f"} Jan 10 17:31:39 crc kubenswrapper[5036]: I0110 17:31:39.270402 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" event={"ID":"689c3546-b16e-4265-9e6e-57ce3915c006","Type":"ContainerStarted","Data":"a8d3bd8cc11e3892c16acaa91326238c71e75b5a9024f216405f776357c99034"} Jan 10 17:31:39 crc kubenswrapper[5036]: I0110 17:31:39.300819 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" podStartSLOduration=2.300800802 podStartE2EDuration="2.300800802s" podCreationTimestamp="2026-01-10 17:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:31:39.285440097 +0000 UTC m=+3821.155675601" watchObservedRunningTime="2026-01-10 17:31:39.300800802 +0000 UTC m=+3821.171036296" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.307729 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-jr6jq"] Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.315316 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.398020 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kppzt\" (UniqueName: \"kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.398124 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.500403 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.500565 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.500600 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kppzt\" (UniqueName: \"kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.522825 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kppzt\" (UniqueName: \"kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt\") pod \"crc-debug-jr6jq\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:43 crc kubenswrapper[5036]: I0110 17:31:43.640568 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:31:44 crc kubenswrapper[5036]: I0110 17:31:44.340635 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" event={"ID":"aff15acd-5067-43b5-b570-2275a4c08c21","Type":"ContainerStarted","Data":"f1f3b021646b558abb05629a929e5420aa688a0eba9ff529894b440c9532b1d4"} Jan 10 17:31:44 crc kubenswrapper[5036]: I0110 17:31:44.341101 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" event={"ID":"aff15acd-5067-43b5-b570-2275a4c08c21","Type":"ContainerStarted","Data":"7875511d9a37ccf3f7d7c8f674c482c5b9352697f54cd177e3c225471accb0d7"} Jan 10 17:31:44 crc kubenswrapper[5036]: I0110 17:31:44.365569 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" podStartSLOduration=1.36555153 podStartE2EDuration="1.36555153s" podCreationTimestamp="2026-01-10 17:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:31:44.358749177 +0000 UTC m=+3826.228984671" watchObservedRunningTime="2026-01-10 17:31:44.36555153 +0000 UTC m=+3826.235787024" Jan 10 17:31:55 crc kubenswrapper[5036]: I0110 17:31:55.904448 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:31:55 crc kubenswrapper[5036]: I0110 17:31:55.905200 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:31:55 crc kubenswrapper[5036]: I0110 17:31:55.905272 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:31:55 crc kubenswrapper[5036]: I0110 17:31:55.906595 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:31:55 crc kubenswrapper[5036]: I0110 17:31:55.906756 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" gracePeriod=600 Jan 10 17:31:56 crc kubenswrapper[5036]: E0110 17:31:56.029156 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:31:56 crc kubenswrapper[5036]: I0110 17:31:56.439737 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" exitCode=0 Jan 10 17:31:56 crc kubenswrapper[5036]: I0110 17:31:56.439811 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e"} Jan 10 17:31:56 crc kubenswrapper[5036]: I0110 17:31:56.440116 5036 scope.go:117] "RemoveContainer" containerID="818dec2ff2a2b1cc25943b2daf0369fb8c17b58d5317d976bd7f432c5df76134" Jan 10 17:31:56 crc kubenswrapper[5036]: I0110 17:31:56.440806 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:31:56 crc kubenswrapper[5036]: E0110 17:31:56.441108 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.259605 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.279270 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.307065 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.361712 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9f84\" (UniqueName: \"kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.362057 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.362096 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.463797 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9f84\" (UniqueName: \"kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84\") pod 
\"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.463890 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.463930 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.464433 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.464993 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.487161 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9f84\" (UniqueName: \"kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84\") pod \"certified-operators-46mnx\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:57 crc kubenswrapper[5036]: I0110 17:31:57.629398 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:31:58 crc kubenswrapper[5036]: I0110 17:31:58.133992 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:31:58 crc kubenswrapper[5036]: I0110 17:31:58.462460 5036 generic.go:334] "Generic (PLEG): container finished" podID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerID="8f17873de3a838cec6d7ccc126b02773e93b8836f29e2234204b0172ad4e9c6c" exitCode=0 Jan 10 17:31:58 crc kubenswrapper[5036]: I0110 17:31:58.462587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerDied","Data":"8f17873de3a838cec6d7ccc126b02773e93b8836f29e2234204b0172ad4e9c6c"} Jan 10 17:31:58 crc kubenswrapper[5036]: I0110 17:31:58.464304 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerStarted","Data":"67f3b62af608c7ca12c913abb6902c15656f89669f395a9c65e0f8248cfdf6cd"} Jan 10 17:31:58 crc kubenswrapper[5036]: I0110 17:31:58.468144 5036 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 10 17:31:59 crc kubenswrapper[5036]: I0110 17:31:59.474680 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerStarted","Data":"f8df9c4bf82431c198010ff33aa75a9615fdae113af9b8e769410e44edc416e6"} Jan 10 17:32:00 crc kubenswrapper[5036]: I0110 17:32:00.487843 5036 generic.go:334] "Generic (PLEG): container finished" podID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerID="f8df9c4bf82431c198010ff33aa75a9615fdae113af9b8e769410e44edc416e6" exitCode=0 Jan 10 17:32:00 crc kubenswrapper[5036]: I0110 17:32:00.487936 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerDied","Data":"f8df9c4bf82431c198010ff33aa75a9615fdae113af9b8e769410e44edc416e6"} Jan 10 17:32:02 crc kubenswrapper[5036]: I0110 17:32:02.519976 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerStarted","Data":"4e3ef8b194ac151f09190dab31c348f9fd1f383e3d7e4bba4a38812fe0f027f3"} Jan 10 17:32:02 crc kubenswrapper[5036]: I0110 17:32:02.539693 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-46mnx" podStartSLOduration=2.506791648 podStartE2EDuration="5.539655114s" podCreationTimestamp="2026-01-10 17:31:57 +0000 UTC" firstStartedPulling="2026-01-10 17:31:58.466888438 +0000 UTC m=+3840.337123932" lastFinishedPulling="2026-01-10 17:32:01.499751904 +0000 UTC m=+3843.369987398" observedRunningTime="2026-01-10 17:32:02.531072711 +0000 UTC m=+3844.401308205" watchObservedRunningTime="2026-01-10 17:32:02.539655114 +0000 UTC m=+3844.409890608" Jan 10 17:32:07 crc kubenswrapper[5036]: I0110 17:32:07.633042 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:07 crc kubenswrapper[5036]: I0110 17:32:07.633623 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:07 crc kubenswrapper[5036]: I0110 17:32:07.681693 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:08 crc kubenswrapper[5036]: I0110 17:32:08.621177 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:08 crc kubenswrapper[5036]: I0110 17:32:08.674279 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:32:09 crc kubenswrapper[5036]: I0110 17:32:09.508353 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:32:09 crc kubenswrapper[5036]: E0110 17:32:09.508906 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:32:10 crc kubenswrapper[5036]: I0110 17:32:10.569754 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-46mnx" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="registry-server" containerID="cri-o://4e3ef8b194ac151f09190dab31c348f9fd1f383e3d7e4bba4a38812fe0f027f3" gracePeriod=2 Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.587046 5036 generic.go:334] "Generic (PLEG): container finished" podID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerID="4e3ef8b194ac151f09190dab31c348f9fd1f383e3d7e4bba4a38812fe0f027f3" exitCode=0 Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.587427 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerDied","Data":"4e3ef8b194ac151f09190dab31c348f9fd1f383e3d7e4bba4a38812fe0f027f3"} Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.587619 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-46mnx" event={"ID":"51614388-435d-4c2b-acaa-c57f8eb1add4","Type":"ContainerDied","Data":"67f3b62af608c7ca12c913abb6902c15656f89669f395a9c65e0f8248cfdf6cd"} Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.587639 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67f3b62af608c7ca12c913abb6902c15656f89669f395a9c65e0f8248cfdf6cd" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.677527 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.869998 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content\") pod \"51614388-435d-4c2b-acaa-c57f8eb1add4\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.871902 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9f84\" (UniqueName: \"kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84\") pod \"51614388-435d-4c2b-acaa-c57f8eb1add4\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.872183 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities\") pod \"51614388-435d-4c2b-acaa-c57f8eb1add4\" (UID: \"51614388-435d-4c2b-acaa-c57f8eb1add4\") " Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.872746 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities" (OuterVolumeSpecName: "utilities") pod "51614388-435d-4c2b-acaa-c57f8eb1add4" (UID: "51614388-435d-4c2b-acaa-c57f8eb1add4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.873627 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.878891 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84" (OuterVolumeSpecName: "kube-api-access-v9f84") pod "51614388-435d-4c2b-acaa-c57f8eb1add4" (UID: "51614388-435d-4c2b-acaa-c57f8eb1add4"). InnerVolumeSpecName "kube-api-access-v9f84". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.923571 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51614388-435d-4c2b-acaa-c57f8eb1add4" (UID: "51614388-435d-4c2b-acaa-c57f8eb1add4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.975898 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51614388-435d-4c2b-acaa-c57f8eb1add4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:11 crc kubenswrapper[5036]: I0110 17:32:11.976165 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9f84\" (UniqueName: \"kubernetes.io/projected/51614388-435d-4c2b-acaa-c57f8eb1add4-kube-api-access-v9f84\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:12 crc kubenswrapper[5036]: I0110 17:32:12.596856 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-46mnx" Jan 10 17:32:12 crc kubenswrapper[5036]: I0110 17:32:12.623635 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:32:12 crc kubenswrapper[5036]: I0110 17:32:12.649033 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-46mnx"] Jan 10 17:32:14 crc kubenswrapper[5036]: I0110 17:32:14.520216 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" path="/var/lib/kubelet/pods/51614388-435d-4c2b-acaa-c57f8eb1add4/volumes" Jan 10 17:32:17 crc kubenswrapper[5036]: I0110 17:32:17.666057 5036 generic.go:334] "Generic (PLEG): container finished" podID="aff15acd-5067-43b5-b570-2275a4c08c21" containerID="f1f3b021646b558abb05629a929e5420aa688a0eba9ff529894b440c9532b1d4" exitCode=0 Jan 10 17:32:17 crc kubenswrapper[5036]: I0110 17:32:17.666147 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" event={"ID":"aff15acd-5067-43b5-b570-2275a4c08c21","Type":"ContainerDied","Data":"f1f3b021646b558abb05629a929e5420aa688a0eba9ff529894b440c9532b1d4"} Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.788465 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.817310 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-jr6jq"] Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.830490 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-jr6jq"] Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.942717 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kppzt\" (UniqueName: \"kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt\") pod \"aff15acd-5067-43b5-b570-2275a4c08c21\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.943005 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host\") pod \"aff15acd-5067-43b5-b570-2275a4c08c21\" (UID: \"aff15acd-5067-43b5-b570-2275a4c08c21\") " Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.943144 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host" (OuterVolumeSpecName: "host") pod "aff15acd-5067-43b5-b570-2275a4c08c21" (UID: "aff15acd-5067-43b5-b570-2275a4c08c21"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.943582 5036 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aff15acd-5067-43b5-b570-2275a4c08c21-host\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:18 crc kubenswrapper[5036]: I0110 17:32:18.949007 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt" (OuterVolumeSpecName: "kube-api-access-kppzt") pod "aff15acd-5067-43b5-b570-2275a4c08c21" (UID: "aff15acd-5067-43b5-b570-2275a4c08c21"). InnerVolumeSpecName "kube-api-access-kppzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:32:19 crc kubenswrapper[5036]: I0110 17:32:19.045900 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kppzt\" (UniqueName: \"kubernetes.io/projected/aff15acd-5067-43b5-b570-2275a4c08c21-kube-api-access-kppzt\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:19 crc kubenswrapper[5036]: I0110 17:32:19.698988 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7875511d9a37ccf3f7d7c8f674c482c5b9352697f54cd177e3c225471accb0d7" Jan 10 17:32:19 crc kubenswrapper[5036]: I0110 17:32:19.699056 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-jr6jq" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.076365 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-gcg7q"] Jan 10 17:32:20 crc kubenswrapper[5036]: E0110 17:32:20.076811 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="registry-server" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.076826 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="registry-server" Jan 10 17:32:20 crc kubenswrapper[5036]: E0110 17:32:20.076854 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="extract-content" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.076862 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="extract-content" Jan 10 17:32:20 crc kubenswrapper[5036]: E0110 17:32:20.076874 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff15acd-5067-43b5-b570-2275a4c08c21" containerName="container-00" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.076882 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff15acd-5067-43b5-b570-2275a4c08c21" containerName="container-00" Jan 10 17:32:20 crc kubenswrapper[5036]: E0110 17:32:20.076903 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="extract-utilities" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.076910 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="extract-utilities" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.077206 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff15acd-5067-43b5-b570-2275a4c08c21" containerName="container-00" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.077466 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="51614388-435d-4c2b-acaa-c57f8eb1add4" containerName="registry-server" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.078268 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.168545 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.168726 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gs9m\" (UniqueName: \"kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.270826 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gs9m\" (UniqueName: \"kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.270946 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.271120 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.291116 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gs9m\" (UniqueName: \"kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m\") pod \"crc-debug-gcg7q\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.396620 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.508237 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:32:20 crc kubenswrapper[5036]: E0110 17:32:20.509090 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.521933 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff15acd-5067-43b5-b570-2275a4c08c21" path="/var/lib/kubelet/pods/aff15acd-5067-43b5-b570-2275a4c08c21/volumes" Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.707335 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" event={"ID":"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34","Type":"ContainerStarted","Data":"d09bdb3e2f4de3cf032352db23540827ee9fb2540bff92579be768c1a4b7b230"} Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.707374 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" event={"ID":"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34","Type":"ContainerStarted","Data":"b952e3d2fd4f859ead574121b154c9d038100a5837917c055c743889a77d3d1f"} Jan 10 17:32:20 crc kubenswrapper[5036]: I0110 17:32:20.728400 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" podStartSLOduration=0.728384383 podStartE2EDuration="728.384383ms" podCreationTimestamp="2026-01-10 17:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-10 17:32:20.72649805 +0000 UTC m=+3862.596733544" watchObservedRunningTime="2026-01-10 17:32:20.728384383 +0000 UTC m=+3862.598619877" Jan 10 17:32:21 crc kubenswrapper[5036]: I0110 17:32:21.720866 5036 generic.go:334] "Generic (PLEG): container finished" podID="279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" containerID="d09bdb3e2f4de3cf032352db23540827ee9fb2540bff92579be768c1a4b7b230" exitCode=0 Jan 10 17:32:21 crc kubenswrapper[5036]: I0110 17:32:21.720929 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" event={"ID":"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34","Type":"ContainerDied","Data":"d09bdb3e2f4de3cf032352db23540827ee9fb2540bff92579be768c1a4b7b230"} Jan 10 17:32:22 crc kubenswrapper[5036]: I0110 17:32:22.856852 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:22 crc kubenswrapper[5036]: I0110 17:32:22.889235 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-gcg7q"] Jan 10 17:32:22 crc kubenswrapper[5036]: I0110 17:32:22.901910 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-gcg7q"] Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.017801 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host\") pod \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.018334 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gs9m\" (UniqueName: \"kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m\") pod \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\" (UID: \"279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34\") " Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.018225 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host" (OuterVolumeSpecName: "host") pod "279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" (UID: "279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.019827 5036 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-host\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.025216 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m" (OuterVolumeSpecName: "kube-api-access-6gs9m") pod "279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" (UID: "279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34"). InnerVolumeSpecName "kube-api-access-6gs9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.122617 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gs9m\" (UniqueName: \"kubernetes.io/projected/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34-kube-api-access-6gs9m\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.740305 5036 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b952e3d2fd4f859ead574121b154c9d038100a5837917c055c743889a77d3d1f" Jan 10 17:32:23 crc kubenswrapper[5036]: I0110 17:32:23.740397 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-gcg7q" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.065731 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-77rjq"] Jan 10 17:32:24 crc kubenswrapper[5036]: E0110 17:32:24.066107 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" containerName="container-00" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.066120 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" containerName="container-00" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.066329 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" containerName="container-00" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.067132 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.140765 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvpb\" (UniqueName: \"kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.141095 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.243922 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.244054 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.244409 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvpb\" (UniqueName: \"kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.264467 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvpb\" (UniqueName: \"kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb\") pod \"crc-debug-77rjq\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.383416 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:24 crc kubenswrapper[5036]: W0110 17:32:24.411095 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23255537_26c8_4df3_8c53_bb9f27a186b6.slice/crio-bf22e73c2fb314184f940176e4e07aafc945c39a8293aba5f21c90d7ce810b4c WatchSource:0}: Error finding container bf22e73c2fb314184f940176e4e07aafc945c39a8293aba5f21c90d7ce810b4c: Status 404 returned error can't find the container with id bf22e73c2fb314184f940176e4e07aafc945c39a8293aba5f21c90d7ce810b4c Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.518593 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34" path="/var/lib/kubelet/pods/279b82ae-4ffc-4e0f-8e4d-f84c11aa9a34/volumes" Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.755176 5036 generic.go:334] "Generic (PLEG): container finished" podID="23255537-26c8-4df3-8c53-bb9f27a186b6" containerID="b9dde3db5fec8e713f37d1f401b72666e3e2f0684d0db77725701d9835b6377f" exitCode=0 Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.755258 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" event={"ID":"23255537-26c8-4df3-8c53-bb9f27a186b6","Type":"ContainerDied","Data":"b9dde3db5fec8e713f37d1f401b72666e3e2f0684d0db77725701d9835b6377f"} Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.755489 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" event={"ID":"23255537-26c8-4df3-8c53-bb9f27a186b6","Type":"ContainerStarted","Data":"bf22e73c2fb314184f940176e4e07aafc945c39a8293aba5f21c90d7ce810b4c"} Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.797963 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-77rjq"] Jan 10 17:32:24 crc kubenswrapper[5036]: I0110 17:32:24.805328 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xk8v/crc-debug-77rjq"] Jan 10 17:32:25 crc kubenswrapper[5036]: I0110 17:32:25.956189 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.140100 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wvpb\" (UniqueName: \"kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb\") pod \"23255537-26c8-4df3-8c53-bb9f27a186b6\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.140271 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host\") pod \"23255537-26c8-4df3-8c53-bb9f27a186b6\" (UID: \"23255537-26c8-4df3-8c53-bb9f27a186b6\") " Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.140326 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host" (OuterVolumeSpecName: "host") pod "23255537-26c8-4df3-8c53-bb9f27a186b6" (UID: "23255537-26c8-4df3-8c53-bb9f27a186b6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.141035 5036 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/23255537-26c8-4df3-8c53-bb9f27a186b6-host\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.145760 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb" (OuterVolumeSpecName: "kube-api-access-2wvpb") pod "23255537-26c8-4df3-8c53-bb9f27a186b6" (UID: "23255537-26c8-4df3-8c53-bb9f27a186b6"). InnerVolumeSpecName "kube-api-access-2wvpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.244550 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wvpb\" (UniqueName: \"kubernetes.io/projected/23255537-26c8-4df3-8c53-bb9f27a186b6-kube-api-access-2wvpb\") on node \"crc\" DevicePath \"\"" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.518356 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23255537-26c8-4df3-8c53-bb9f27a186b6" path="/var/lib/kubelet/pods/23255537-26c8-4df3-8c53-bb9f27a186b6/volumes" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.773302 5036 scope.go:117] "RemoveContainer" containerID="b9dde3db5fec8e713f37d1f401b72666e3e2f0684d0db77725701d9835b6377f" Jan 10 17:32:26 crc kubenswrapper[5036]: I0110 17:32:26.773617 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/crc-debug-77rjq" Jan 10 17:32:34 crc kubenswrapper[5036]: I0110 17:32:34.508241 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:32:34 crc kubenswrapper[5036]: E0110 17:32:34.510556 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:32:45 crc kubenswrapper[5036]: I0110 17:32:45.508354 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:32:45 crc kubenswrapper[5036]: E0110 17:32:45.509163 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:00 crc kubenswrapper[5036]: I0110 17:33:00.508897 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:33:00 crc kubenswrapper[5036]: E0110 17:33:00.509938 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:11 crc kubenswrapper[5036]: I0110 17:33:11.508773 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:33:11 crc kubenswrapper[5036]: E0110 17:33:11.509372 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:21 crc kubenswrapper[5036]: I0110 17:33:21.736949 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7655587964-dzfxf_a96677c4-c2f0-4fba-bcb0-a657dfdd1f41/barbican-api/0.log" Jan 10 17:33:21 crc kubenswrapper[5036]: I0110 17:33:21.868239 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7655587964-dzfxf_a96677c4-c2f0-4fba-bcb0-a657dfdd1f41/barbican-api-log/0.log" Jan 10 17:33:21 crc kubenswrapper[5036]: I0110 17:33:21.912822 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77cbb79454-h7btf_731670b8-d6af-49c5-b8cf-ddeafb2462c7/barbican-keystone-listener/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.035254 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-77cbb79454-h7btf_731670b8-d6af-49c5-b8cf-ddeafb2462c7/barbican-keystone-listener-log/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.102311 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55c7665d4c-brkx9_608bfa08-ff8b-4f06-bc62-e456f9e2005c/barbican-worker/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.180911 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-55c7665d4c-brkx9_608bfa08-ff8b-4f06-bc62-e456f9e2005c/barbican-worker-log/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.311691 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4qlrz_d9f0ccdb-1434-4bd0-90e1-d9314c8d716f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.387963 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/ceilometer-central-agent/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.462030 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/ceilometer-notification-agent/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.548619 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/proxy-httpd/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.560362 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_eaeec74d-5c59-4684-81e3-7ca32b833f59/sg-core/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.683589 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-fkcrr_35435ad9-1b59-46c6-b2c7-a57b43c65a3d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.793381 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wpd77_993c9fcb-a10b-4d08-ae74-2bc52e9d8131/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.945361 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bfc371-7aba-4a4d-b018-4a79ad8a0b7b/cinder-api/0.log" Jan 10 17:33:22 crc kubenswrapper[5036]: I0110 17:33:22.985850 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_04bfc371-7aba-4a4d-b018-4a79ad8a0b7b/cinder-api-log/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.156212 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_72df80ca-b881-4bc6-b6bc-816dccb6a4a6/probe/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.372582 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_db9849cf-82c8-4f9d-86f2-c7bf664528c9/cinder-scheduler/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.385951 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_db9849cf-82c8-4f9d-86f2-c7bf664528c9/probe/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.543475 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_72df80ca-b881-4bc6-b6bc-816dccb6a4a6/cinder-backup/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.630502 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_5e51ea81-c177-4dc1-a427-c3290a9e6010/probe/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.721431 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_5e51ea81-c177-4dc1-a427-c3290a9e6010/cinder-volume/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.824610 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-qt7x4_feaba290-606b-4396-af62-f32fd6e33a53/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:23 crc kubenswrapper[5036]: I0110 17:33:23.907975 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-b4gng_bf597c03-b76a-445a-84d3-034d70ca102e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.083873 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/init/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.228839 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/init/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.305677 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_31000160-d620-481e-8b44-98f23e2e0679/glance-httpd/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.322379 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-dd8k9_7186e5b3-1cc5-422b-8151-4a873bf08a6a/dnsmasq-dns/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.501553 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_46236d51-28af-48ad-8aff-2300b9d0155f/glance-httpd/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.548661 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_31000160-d620-481e-8b44-98f23e2e0679/glance-log/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.574844 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_46236d51-28af-48ad-8aff-2300b9d0155f/glance-log/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.853757 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bcc8455c4-njd4j_e92a2ceb-4619-4207-a2a3-b6c588674ab8/horizon/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.933503 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5bcc8455c4-njd4j_e92a2ceb-4619-4207-a2a3-b6c588674ab8/horizon-log/0.log" Jan 10 17:33:24 crc kubenswrapper[5036]: I0110 17:33:24.946083 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qwmg9_421d37b9-14cd-4270-b305-c6f946cd32a3/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.048016 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zzhbf_7be91e0f-1820-445f-b106-0558e046ac4a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.199008 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f7c9c789b-dj95d_75115cba-8c6e-4c48-b71c-0277c43f446c/keystone-api/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.214337 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29467741-znhmh_6bcb0a70-9f58-48f3-b35d-3adf490692cb/keystone-cron/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.366269 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2c6502b1-879a-46ee-a2ff-54cece3ee9e6/kube-state-metrics/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.461804 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vhjn6_b0c29b9c-0e82-4bbc-89af-fa26d3c4603b/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.554595 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_32551bcd-e5f3-445c-b4d2-d4ac138a54ce/manila-api-log/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.880791 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_32551bcd-e5f3-445c-b4d2-d4ac138a54ce/manila-api/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.948101 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_da772573-b489-4f28-85da-5d242835ae61/manila-scheduler/0.log" Jan 10 17:33:25 crc kubenswrapper[5036]: I0110 17:33:25.996550 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-scheduler-0_da772573-b489-4f28-85da-5d242835ae61/probe/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.073354 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c/manila-share/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.092314 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ecd8c0ec-8b91-46fb-9c5b-36c16d0e4c7c/probe/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.337028 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74d5fd97c9-96pjx_ffb9a3a8-bbeb-414f-8d26-f35e51a05957/neutron-api/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.339936 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74d5fd97c9-96pjx_ffb9a3a8-bbeb-414f-8d26-f35e51a05957/neutron-httpd/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.491012 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-4bc6x_3f111a6e-f987-4636-aada-aee2793d5047/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.509739 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:33:26 crc kubenswrapper[5036]: E0110 17:33:26.510223 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.802166 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3cf23453-9366-4458-9e7c-af60e7ef7b83/nova-api-log/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.920561 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_066ba36b-3da0-4db3-8f19-13e5a5227ab5/memcached/0.log" Jan 10 17:33:26 crc kubenswrapper[5036]: I0110 17:33:26.987957 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e98e9d9c-f90a-44da-9b67-2dadaf5b24b3/nova-cell0-conductor-conductor/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.135737 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3cf23453-9366-4458-9e7c-af60e7ef7b83/nova-api-api/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.161467 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1de05ac5-ff01-445f-b1a8-41a7db2a70c4/nova-cell1-conductor-conductor/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.260015 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8c0ed0eb-87d3-43cc-bdbb-1269890e7799/nova-cell1-novncproxy-novncproxy/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.422595 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-tn6xl_b4da8068-8e5a-4624-b65f-05da63640d19/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 
17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.432007 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f2dade9a-7926-4c9b-82df-4c525efd69db/nova-metadata-log/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.652236 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/mysql-bootstrap/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.761723 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_50d7fbd5-136f-4138-b4de-7d0841e80688/nova-scheduler-scheduler/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.849114 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/mysql-bootstrap/0.log" Jan 10 17:33:27 crc kubenswrapper[5036]: I0110 17:33:27.860509 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_78b8c3a9-e6b8-4f1a-b0a4-5370e9e5e2f2/galera/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.027706 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/mysql-bootstrap/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.213679 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/galera/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.246978 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3f624572-bbfe-4c9d-be6f-f8f647fd8aa2/mysql-bootstrap/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.247941 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_70cfbefa-2928-4ca5-aa74-93fb1b4cd059/openstackclient/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.387218 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f2dade9a-7926-4c9b-82df-4c525efd69db/nova-metadata-metadata/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.473236 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-czqbw_be4f7b3d-ab10-498f-ac5a-9b37dafcd5f4/ovn-controller/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.513597 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jp5mj_d0f482ce-10a1-42c2-80f6-60fd28c8cc25/openstack-network-exporter/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.604377 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server-init/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.767384 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovs-vswitchd/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.787177 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.828880 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-skrfj_7cb46990-94ee-4a82-93a2-a30c563f1146/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 
17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.835491 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vsd6b_65d28afa-c448-4c8a-8fe9-062d9383f484/ovsdb-server-init/0.log" Jan 10 17:33:28 crc kubenswrapper[5036]: I0110 17:33:28.958998 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d1aa719-1166-4afe-8263-c771aa0a25da/openstack-network-exporter/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.015550 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d1aa719-1166-4afe-8263-c771aa0a25da/ovn-northd/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.117893 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b642befa-dd18-4984-b74f-d3945ee06f7d/openstack-network-exporter/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.164000 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f74eaf1-cd39-41dc-8c0a-170373e863e5/openstack-network-exporter/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.187655 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b642befa-dd18-4984-b74f-d3945ee06f7d/ovsdbserver-nb/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.269420 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4f74eaf1-cd39-41dc-8c0a-170373e863e5/ovsdbserver-sb/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.420552 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffbbc4bd-swcjc_5b379ab6-fc59-475f-909f-4f71e7184803/placement-api/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.438526 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffbbc4bd-swcjc_5b379ab6-fc59-475f-909f-4f71e7184803/placement-log/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.475773 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/setup-container/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.620503 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/setup-container/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.631461 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_debd7e7e-7e74-43b6-b3d1-70ae0ee20dd1/rabbitmq/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.634832 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/setup-container/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.809751 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/setup-container/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.831789 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e33d0131-d1d9-42cb-b772-7fe9835cee44/rabbitmq/0.log" Jan 10 17:33:29 crc kubenswrapper[5036]: I0110 17:33:29.880019 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-tkc9f_f551e3a3-cdf6-4fc6-8452-869afe1cef86/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:30 crc 
kubenswrapper[5036]: I0110 17:33:30.185400 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-j9frx_8e48f105-5183-4dd8-94d9-8a8636ca4c82/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:30 crc kubenswrapper[5036]: I0110 17:33:30.267599 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mw76h_956e3be3-ef01-423c-a80d-1b6c517aee91/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:30 crc kubenswrapper[5036]: I0110 17:33:30.310965 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jw6dr_35cc2e15-b6d3-419a-b719-1fcee66ce1b5/ssh-known-hosts-edpm-deployment/0.log" Jan 10 17:33:30 crc kubenswrapper[5036]: I0110 17:33:30.442443 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d98e00b3-6224-462a-abe0-52e09ac44fb8/tempest-tests-tempest-tests-runner/0.log" Jan 10 17:33:30 crc kubenswrapper[5036]: I0110 17:33:30.493187 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_bbb64716-afc7-4c1a-be2f-2ff9cc886e96/test-operator-logs-container/0.log" Jan 10 17:33:30 crc kubenswrapper[5036]: I0110 17:33:30.662794 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z6vdg_be9c4cc3-5744-42de-809a-fcd16a407199/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 10 17:33:38 crc kubenswrapper[5036]: I0110 17:33:38.516276 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:33:38 crc kubenswrapper[5036]: E0110 17:33:38.517164 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:51 crc kubenswrapper[5036]: I0110 17:33:51.508771 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:33:51 crc kubenswrapper[5036]: E0110 17:33:51.511533 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.172623 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.540435 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.547211 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.560652 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.729983 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/pull/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.740314 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/util/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.750276 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_55e91577fa14603399621ff76b53b515e4a4f417dea2ccb9c8354c794ehfqpx_b93cb83a-a272-4416-bff9-4da9aeb4f412/extract/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.974623 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-78979fc445-2qq47_f1b7f315-826c-4a66-9919-69b3c75a648e/manager/0.log" Jan 10 17:33:53 crc kubenswrapper[5036]: I0110 17:33:53.991312 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-678b8c6d96-568pc_a17f3d4e-41a9-4941-83f6-090808b6cb29/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.242312 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5967c8645c-cbdjv_09239a1e-ce39-49e7-a532-f7c353022176/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.251548 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-trzdf_52b19fea-05ac-4448-9446-33fbee11b2da/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.414639 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-65c54c675d-ng9ld_ecf84720-507a-4a26-8326-7ed56754871e/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.427294 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7998b4cc7b-bjxnm_58ba757d-493c-4a4c-9aaa-a3178272b7cb/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.622279 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5b47c74dd5-skh8x_b4905be6-774a-4952-b195-f755688c7b26/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.867105 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-568985c78-cs2b2_611b3f4f-0b6d-4ef9-b040-eba991c4bfe4/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.885064 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xcmds_80ddf12b-ee61-4d6f-a3fb-ff9aded793d7/manager/0.log" Jan 10 17:33:54 crc kubenswrapper[5036]: I0110 17:33:54.986470 5036 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-598945d5b8-l8ggv_552c1d94-e289-46e0-8756-58982a7cdc4c/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.126053 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-746ccdd857-kkjhp_0a3b9993-b2fb-4dda-952a-413cd5a3e01a/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.182671 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-4t295_6414be0b-ef34-4c95-9e31-4124dcad6cc4/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.410199 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-wv445_254c9f2b-ef77-4fbc-9884-c14caa297876/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.444052 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-t7qtf_506aa4ca-31bb-48da-94b5-9ab7b43aea96/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.559544 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5b4889549f2j7sh_f7c6aeaf-94ec-4558-8ec7-b4fd144a49b1/manager/0.log" Jan 10 17:33:55 crc kubenswrapper[5036]: I0110 17:33:55.950003 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sdc2j_3929902d-323d-44ec-84be-4069e262618f/registry-server/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.013377 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5d4cd6578d-pt5gl_0ab4dccd-a4ff-49f6-96bf-a7150425ff15/operator/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.186852 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-zz7v2_cf6aa765-9fbf-429d-83c1-db4671e7600c/manager/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.306348 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-84587ffc8-l7b7s_6283e4f6-c60e-4bff-b622-181c4abbc8a6/manager/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.507141 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gcjhz_4ddc3dbc-f7b1-4627-9740-9e2f5c0296fd/operator/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.671416 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bb586bbf4-tn7cg_2e9ebb80-028a-43ac-b9cb-379dd1eda24e/manager/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.878625 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-68d988df55-88zlb_2c21d679-225e-4c33-8920-06a85ae163b6/manager/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.898581 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56458c9ddd-p4bsn_de8e8f66-6d85-43d5-94a4-613fb3bfc53b/manager/0.log" Jan 10 17:33:56 crc kubenswrapper[5036]: I0110 17:33:56.942078 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-6c866cfdcb-nbdkb_f3046ad8-aadd-4883-82b9-a794ddce82b9/manager/0.log" Jan 10 17:33:57 crc kubenswrapper[5036]: I0110 17:33:57.052644 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-9dbdf6486-fwmft_7d8099e2-6cd1-4ce8-b78b-0b51a4fedf42/manager/0.log" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.238012 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:00 crc kubenswrapper[5036]: E0110 17:34:00.238607 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23255537-26c8-4df3-8c53-bb9f27a186b6" containerName="container-00" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.238619 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="23255537-26c8-4df3-8c53-bb9f27a186b6" containerName="container-00" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.238807 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="23255537-26c8-4df3-8c53-bb9f27a186b6" containerName="container-00" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.240041 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.256888 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.258045 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.258126 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6fb\" (UniqueName: \"kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.258160 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.359586 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6fb\" (UniqueName: \"kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.359786 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc 
kubenswrapper[5036]: I0110 17:34:00.360142 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.360305 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.360644 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.386458 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6fb\" (UniqueName: \"kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb\") pod \"redhat-operators-czh8g\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:00 crc kubenswrapper[5036]: I0110 17:34:00.560167 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:01 crc kubenswrapper[5036]: I0110 17:34:01.072159 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:01 crc kubenswrapper[5036]: I0110 17:34:01.626609 5036 generic.go:334] "Generic (PLEG): container finished" podID="35394862-a188-4103-ad03-16570809e4e5" containerID="d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa" exitCode=0 Jan 10 17:34:01 crc kubenswrapper[5036]: I0110 17:34:01.626659 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerDied","Data":"d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa"} Jan 10 17:34:01 crc kubenswrapper[5036]: I0110 17:34:01.626705 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerStarted","Data":"f114c6e42cfb77b66a8480e460ff1096c55c1fcf092e50132ff4dd40f0435da7"} Jan 10 17:34:03 crc kubenswrapper[5036]: I0110 17:34:03.643996 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerStarted","Data":"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5"} Jan 10 17:34:05 crc kubenswrapper[5036]: I0110 17:34:05.662720 5036 generic.go:334] "Generic (PLEG): container finished" podID="35394862-a188-4103-ad03-16570809e4e5" containerID="f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5" exitCode=0 Jan 10 17:34:05 crc kubenswrapper[5036]: I0110 17:34:05.662801 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" 
event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerDied","Data":"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5"} Jan 10 17:34:06 crc kubenswrapper[5036]: I0110 17:34:06.508865 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:34:06 crc kubenswrapper[5036]: E0110 17:34:06.509488 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:34:06 crc kubenswrapper[5036]: I0110 17:34:06.678832 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerStarted","Data":"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f"} Jan 10 17:34:06 crc kubenswrapper[5036]: I0110 17:34:06.701233 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-czh8g" podStartSLOduration=2.048123419 podStartE2EDuration="6.701209597s" podCreationTimestamp="2026-01-10 17:34:00 +0000 UTC" firstStartedPulling="2026-01-10 17:34:01.629878273 +0000 UTC m=+3963.500113787" lastFinishedPulling="2026-01-10 17:34:06.282964471 +0000 UTC m=+3968.153199965" observedRunningTime="2026-01-10 17:34:06.694215759 +0000 UTC m=+3968.564451283" watchObservedRunningTime="2026-01-10 17:34:06.701209597 +0000 UTC m=+3968.571445131" Jan 10 17:34:10 crc kubenswrapper[5036]: I0110 17:34:10.561284 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:10 crc kubenswrapper[5036]: I0110 17:34:10.561897 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:11 crc kubenswrapper[5036]: I0110 17:34:11.607194 5036 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-czh8g" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="registry-server" probeResult="failure" output=< Jan 10 17:34:11 crc kubenswrapper[5036]: timeout: failed to connect service ":50051" within 1s Jan 10 17:34:11 crc kubenswrapper[5036]: > Jan 10 17:34:17 crc kubenswrapper[5036]: I0110 17:34:17.076137 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ptztt_acb54813-4c4d-4b94-9337-19541ac1980e/control-plane-machine-set-operator/0.log" Jan 10 17:34:17 crc kubenswrapper[5036]: I0110 17:34:17.250634 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-45j5v_6b14e5d5-1b40-45f6-a5c6-c161eeade0f9/kube-rbac-proxy/0.log" Jan 10 17:34:17 crc kubenswrapper[5036]: I0110 17:34:17.286758 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-45j5v_6b14e5d5-1b40-45f6-a5c6-c161eeade0f9/machine-api-operator/0.log" Jan 10 17:34:20 crc kubenswrapper[5036]: I0110 17:34:20.508947 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:34:20 crc 
kubenswrapper[5036]: E0110 17:34:20.509837 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:34:20 crc kubenswrapper[5036]: I0110 17:34:20.616928 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:20 crc kubenswrapper[5036]: I0110 17:34:20.669636 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:20 crc kubenswrapper[5036]: I0110 17:34:20.859555 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:21 crc kubenswrapper[5036]: I0110 17:34:21.830484 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-czh8g" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="registry-server" containerID="cri-o://313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f" gracePeriod=2 Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.379835 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.458715 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities\") pod \"35394862-a188-4103-ad03-16570809e4e5\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.458795 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content\") pod \"35394862-a188-4103-ad03-16570809e4e5\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.458922 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk6fb\" (UniqueName: \"kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb\") pod \"35394862-a188-4103-ad03-16570809e4e5\" (UID: \"35394862-a188-4103-ad03-16570809e4e5\") " Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.459753 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities" (OuterVolumeSpecName: "utilities") pod "35394862-a188-4103-ad03-16570809e4e5" (UID: "35394862-a188-4103-ad03-16570809e4e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.469161 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb" (OuterVolumeSpecName: "kube-api-access-vk6fb") pod "35394862-a188-4103-ad03-16570809e4e5" (UID: "35394862-a188-4103-ad03-16570809e4e5"). InnerVolumeSpecName "kube-api-access-vk6fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.561048 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.561085 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk6fb\" (UniqueName: \"kubernetes.io/projected/35394862-a188-4103-ad03-16570809e4e5-kube-api-access-vk6fb\") on node \"crc\" DevicePath \"\"" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.602954 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35394862-a188-4103-ad03-16570809e4e5" (UID: "35394862-a188-4103-ad03-16570809e4e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.662764 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35394862-a188-4103-ad03-16570809e4e5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.842839 5036 generic.go:334] "Generic (PLEG): container finished" podID="35394862-a188-4103-ad03-16570809e4e5" containerID="313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f" exitCode=0 Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.842913 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-czh8g" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.842939 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerDied","Data":"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f"} Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.843289 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-czh8g" event={"ID":"35394862-a188-4103-ad03-16570809e4e5","Type":"ContainerDied","Data":"f114c6e42cfb77b66a8480e460ff1096c55c1fcf092e50132ff4dd40f0435da7"} Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.843317 5036 scope.go:117] "RemoveContainer" containerID="313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.873094 5036 scope.go:117] "RemoveContainer" containerID="f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.876906 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.885378 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-czh8g"] Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.897705 5036 scope.go:117] "RemoveContainer" containerID="d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.941902 5036 scope.go:117] "RemoveContainer" containerID="313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f" Jan 10 17:34:22 crc kubenswrapper[5036]: E0110 17:34:22.942395 5036 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f\": container with ID starting with 313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f not found: ID does not exist" containerID="313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.942444 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f"} err="failed to get container status \"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f\": rpc error: code = NotFound desc = could not find container \"313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f\": container with ID starting with 313f3c6d8923cf9c112db39c8c64e72ea3f7f60a701ea11e6fb698a8cd2cf39f not found: ID does not exist" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.942487 5036 scope.go:117] "RemoveContainer" containerID="f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5" Jan 10 17:34:22 crc kubenswrapper[5036]: E0110 17:34:22.943029 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5\": container with ID starting with f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5 not found: ID does not exist" containerID="f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.943059 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5"} err="failed to get container status \"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5\": rpc error: code = NotFound desc = could not find container \"f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5\": container with ID starting with f61d6baaad63d5ada89d83d61342cfe59e298d52a8348104188f73caaa9f67b5 not found: ID does not exist" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.943081 5036 scope.go:117] "RemoveContainer" containerID="d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa" Jan 10 17:34:22 crc kubenswrapper[5036]: E0110 17:34:22.943336 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa\": container with ID starting with d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa not found: ID does not exist" containerID="d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa" Jan 10 17:34:22 crc kubenswrapper[5036]: I0110 17:34:22.943351 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa"} err="failed to get container status \"d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa\": rpc error: code = NotFound desc = could not find container \"d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa\": container with ID starting with d8c42c8b49d5e5c90c3189b23d8f8be6fdc8f040322f7223226e693a75454faa not found: ID does not exist" Jan 10 17:34:24 crc kubenswrapper[5036]: I0110 17:34:24.526573 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="35394862-a188-4103-ad03-16570809e4e5" path="/var/lib/kubelet/pods/35394862-a188-4103-ad03-16570809e4e5/volumes" Jan 10 17:34:31 crc kubenswrapper[5036]: I0110 17:34:31.110963 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-hrdzf_ef12a866-7983-4859-8d00-6ba6ed292af3/cert-manager-controller/0.log" Jan 10 17:34:31 crc kubenswrapper[5036]: I0110 17:34:31.229238 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-b7rxm_2574f8f4-e56e-4d7e-b181-5e01d69b1485/cert-manager-cainjector/0.log" Jan 10 17:34:31 crc kubenswrapper[5036]: I0110 17:34:31.397387 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-pgpxj_b15af209-c459-40f3-affc-0d5a3d2b031d/cert-manager-webhook/0.log" Jan 10 17:34:34 crc kubenswrapper[5036]: I0110 17:34:34.507806 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:34:34 crc kubenswrapper[5036]: E0110 17:34:34.508305 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:34:45 crc kubenswrapper[5036]: I0110 17:34:45.603500 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-577fv_31225a12-4366-4224-9bb5-3c8ee635a631/nmstate-console-plugin/0.log" Jan 10 17:34:45 crc kubenswrapper[5036]: I0110 17:34:45.784037 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7r9qs_5abcf259-63b1-44b5-b335-950b101edec4/nmstate-handler/0.log" Jan 10 17:34:45 crc kubenswrapper[5036]: I0110 17:34:45.808407 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-f826c_d921a9df-835d-4165-ac39-8717cfcf384d/kube-rbac-proxy/0.log" Jan 10 17:34:45 crc kubenswrapper[5036]: I0110 17:34:45.824205 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-f826c_d921a9df-835d-4165-ac39-8717cfcf384d/nmstate-metrics/0.log" Jan 10 17:34:45 crc kubenswrapper[5036]: I0110 17:34:45.977652 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-bnf7g_d8791ab9-ee3b-4af7-98d5-2bc06f5d863a/nmstate-operator/0.log" Jan 10 17:34:46 crc kubenswrapper[5036]: I0110 17:34:46.014592 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-cdrlk_e0f63dbf-f65f-4d9a-8cf4-802a41ed012b/nmstate-webhook/0.log" Jan 10 17:34:49 crc kubenswrapper[5036]: I0110 17:34:49.508598 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:34:49 crc kubenswrapper[5036]: E0110 17:34:49.509226 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:02 crc kubenswrapper[5036]: I0110 17:35:02.508552 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:35:02 crc kubenswrapper[5036]: E0110 17:35:02.509481 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.131360 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:08 crc kubenswrapper[5036]: E0110 17:35:08.132254 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="extract-content" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.132269 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="extract-content" Jan 10 17:35:08 crc kubenswrapper[5036]: E0110 17:35:08.132281 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="extract-utilities" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.132287 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="extract-utilities" Jan 10 17:35:08 crc kubenswrapper[5036]: E0110 17:35:08.132307 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="registry-server" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.132314 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="registry-server" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.132519 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="35394862-a188-4103-ad03-16570809e4e5" containerName="registry-server" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.134061 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.158737 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.266586 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j446\" (UniqueName: \"kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.266738 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.266917 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.369416 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.369500 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j446\" (UniqueName: \"kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.369554 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.370340 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.370840 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.394032 5036 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5j446\" (UniqueName: \"kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446\") pod \"redhat-marketplace-skjhf\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.464602 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:08 crc kubenswrapper[5036]: I0110 17:35:08.975189 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:09 crc kubenswrapper[5036]: I0110 17:35:09.330925 5036 generic.go:334] "Generic (PLEG): container finished" podID="41498079-4e4d-412f-b877-624e5473b06d" containerID="22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d" exitCode=0 Jan 10 17:35:09 crc kubenswrapper[5036]: I0110 17:35:09.331043 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerDied","Data":"22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d"} Jan 10 17:35:09 crc kubenswrapper[5036]: I0110 17:35:09.331226 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerStarted","Data":"2a73a94a85d7897e7c224b38bf9abe3e33b92681f2850e2a50b4c0bbad81ab01"} Jan 10 17:35:11 crc kubenswrapper[5036]: I0110 17:35:11.350953 5036 generic.go:334] "Generic (PLEG): container finished" podID="41498079-4e4d-412f-b877-624e5473b06d" containerID="1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161" exitCode=0 Jan 10 17:35:11 crc kubenswrapper[5036]: I0110 17:35:11.351028 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerDied","Data":"1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161"} Jan 10 17:35:12 crc kubenswrapper[5036]: I0110 17:35:12.363913 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerStarted","Data":"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43"} Jan 10 17:35:12 crc kubenswrapper[5036]: I0110 17:35:12.396403 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-skjhf" podStartSLOduration=1.9808046940000001 podStartE2EDuration="4.39637642s" podCreationTimestamp="2026-01-10 17:35:08 +0000 UTC" firstStartedPulling="2026-01-10 17:35:09.332800824 +0000 UTC m=+4031.203036318" lastFinishedPulling="2026-01-10 17:35:11.74837255 +0000 UTC m=+4033.618608044" observedRunningTime="2026-01-10 17:35:12.384858524 +0000 UTC m=+4034.255094028" watchObservedRunningTime="2026-01-10 17:35:12.39637642 +0000 UTC m=+4034.266611934" Jan 10 17:35:14 crc kubenswrapper[5036]: I0110 17:35:14.508630 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:35:14 crc kubenswrapper[5036]: E0110 17:35:14.509295 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.352317 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5xsxl_fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69/kube-rbac-proxy/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.460522 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-5xsxl_fe5262aa-1fd6-4b80-a3ad-5bd9fa48cb69/controller/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.547647 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.694465 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.745583 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.760452 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.769310 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.952630 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.961976 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.971773 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:35:15 crc kubenswrapper[5036]: I0110 17:35:15.977518 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.139185 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-frr-files/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.168012 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-reloader/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.177176 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/cp-metrics/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.177539 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/controller/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.334412 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/frr-metrics/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.388237 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/kube-rbac-proxy-frr/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.395803 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/kube-rbac-proxy/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.518915 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/reloader/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.618556 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-qzgnv_24cab86b-603a-48b4-9b8f-add5e9a79f7b/frr-k8s-webhook-server/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.861381 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-57fdf6dfbb-rvjhl_27d72d19-58d6-4094-8d3a-826354e6bb02/manager/0.log" Jan 10 17:35:16 crc kubenswrapper[5036]: I0110 17:35:16.934613 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86bf866985-6ggxt_12be6fd4-c97c-439e-8a06-3769f37d7b48/webhook-server/0.log" Jan 10 17:35:17 crc kubenswrapper[5036]: I0110 17:35:17.049914 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxfjm_6d60af08-1ea0-49e4-aa55-8f9bfa63b34b/kube-rbac-proxy/0.log" Jan 10 17:35:17 crc kubenswrapper[5036]: I0110 17:35:17.610814 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bxfjm_6d60af08-1ea0-49e4-aa55-8f9bfa63b34b/speaker/0.log" Jan 10 17:35:17 crc kubenswrapper[5036]: I0110 17:35:17.779768 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vvnr7_afae39b8-393f-46d1-a436-512d9ba68c25/frr/0.log" Jan 10 17:35:18 crc kubenswrapper[5036]: I0110 17:35:18.465312 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:18 crc kubenswrapper[5036]: I0110 17:35:18.465720 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:18 crc kubenswrapper[5036]: I0110 17:35:18.521206 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:19 crc kubenswrapper[5036]: I0110 17:35:19.487498 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:19 crc kubenswrapper[5036]: I0110 17:35:19.540842 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:21 crc kubenswrapper[5036]: I0110 17:35:21.441943 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-skjhf" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="registry-server" containerID="cri-o://89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43" gracePeriod=2 Jan 10 17:35:21 crc kubenswrapper[5036]: I0110 17:35:21.913662 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.086618 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities\") pod \"41498079-4e4d-412f-b877-624e5473b06d\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.086696 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j446\" (UniqueName: \"kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446\") pod \"41498079-4e4d-412f-b877-624e5473b06d\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.086799 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content\") pod \"41498079-4e4d-412f-b877-624e5473b06d\" (UID: \"41498079-4e4d-412f-b877-624e5473b06d\") " Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.087493 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities" (OuterVolumeSpecName: "utilities") pod "41498079-4e4d-412f-b877-624e5473b06d" (UID: "41498079-4e4d-412f-b877-624e5473b06d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.107555 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446" (OuterVolumeSpecName: "kube-api-access-5j446") pod "41498079-4e4d-412f-b877-624e5473b06d" (UID: "41498079-4e4d-412f-b877-624e5473b06d"). InnerVolumeSpecName "kube-api-access-5j446". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.129349 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41498079-4e4d-412f-b877-624e5473b06d" (UID: "41498079-4e4d-412f-b877-624e5473b06d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.189442 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.189483 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j446\" (UniqueName: \"kubernetes.io/projected/41498079-4e4d-412f-b877-624e5473b06d-kube-api-access-5j446\") on node \"crc\" DevicePath \"\"" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.189503 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41498079-4e4d-412f-b877-624e5473b06d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.450884 5036 generic.go:334] "Generic (PLEG): container finished" podID="41498079-4e4d-412f-b877-624e5473b06d" containerID="89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43" exitCode=0 Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.450942 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-skjhf" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.450958 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerDied","Data":"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43"} Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.451266 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-skjhf" event={"ID":"41498079-4e4d-412f-b877-624e5473b06d","Type":"ContainerDied","Data":"2a73a94a85d7897e7c224b38bf9abe3e33b92681f2850e2a50b4c0bbad81ab01"} Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.451290 5036 scope.go:117] "RemoveContainer" containerID="89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.472087 5036 scope.go:117] "RemoveContainer" containerID="1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.505831 5036 scope.go:117] "RemoveContainer" containerID="22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.519313 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.519347 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-skjhf"] Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.569419 5036 scope.go:117] "RemoveContainer" containerID="89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43" Jan 10 17:35:22 crc kubenswrapper[5036]: E0110 17:35:22.569866 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43\": container with ID starting with 89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43 not found: ID does not exist" containerID="89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.569905 5036 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43"} err="failed to get container status \"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43\": rpc error: code = NotFound desc = could not find container \"89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43\": container with ID starting with 89133d3a62b280b8b769fffc8ecec0ff63e3d7daf87db6d2932dc2470d4b4f43 not found: ID does not exist" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.569930 5036 scope.go:117] "RemoveContainer" containerID="1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161" Jan 10 17:35:22 crc kubenswrapper[5036]: E0110 17:35:22.570346 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161\": container with ID starting with 1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161 not found: ID does not exist" containerID="1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.570395 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161"} err="failed to get container status \"1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161\": rpc error: code = NotFound desc = could not find container \"1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161\": container with ID starting with 1e2f0bc518589788503cedc834e15c31db03cfaf7ca6a54f133afd7abf06f161 not found: ID does not exist" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.570429 5036 scope.go:117] "RemoveContainer" containerID="22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d" Jan 10 17:35:22 crc kubenswrapper[5036]: E0110 17:35:22.570765 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d\": container with ID starting with 22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d not found: ID does not exist" containerID="22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d" Jan 10 17:35:22 crc kubenswrapper[5036]: I0110 17:35:22.570795 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d"} err="failed to get container status \"22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d\": rpc error: code = NotFound desc = could not find container \"22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d\": container with ID starting with 22b977913e0ca96cfafc1d9e7ba805366866ba17a32f9eca52bf46e9e1cc702d not found: ID does not exist" Jan 10 17:35:24 crc kubenswrapper[5036]: I0110 17:35:24.522376 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41498079-4e4d-412f-b877-624e5473b06d" path="/var/lib/kubelet/pods/41498079-4e4d-412f-b877-624e5473b06d/volumes" Jan 10 17:35:26 crc kubenswrapper[5036]: I0110 17:35:26.509622 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:35:26 crc kubenswrapper[5036]: E0110 17:35:26.510599 5036 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.539975 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.740140 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.767775 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.813362 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.953811 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/util/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.954380 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/extract/0.log" Jan 10 17:35:31 crc kubenswrapper[5036]: I0110 17:35:31.961321 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4ckgms_56b82ef1-8690-4ba9-9ebe-1ce6b933df2b/pull/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.132199 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.359666 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.359858 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.400531 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.508359 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/util/0.log" Jan 10 17:35:32 crc 
kubenswrapper[5036]: I0110 17:35:32.549034 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/pull/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.603383 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8h27s4_ccb8fe79-0985-4f47-9885-cb6561c44e59/extract/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.693378 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.875950 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.908357 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:35:32 crc kubenswrapper[5036]: I0110 17:35:32.929905 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.151475 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-utilities/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.179039 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/extract-content/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.408467 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.529481 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.568387 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nfqrz_3195b346-8a73-4e01-9842-5a7fde228f6e/registry-server/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.607709 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.629459 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.785277 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-content/0.log" Jan 10 17:35:33 crc kubenswrapper[5036]: I0110 17:35:33.798932 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/extract-utilities/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.007413 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gm65z_d8de44e3-ed07-4c76-8aa8-2265c9cd1805/marketplace-operator/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.235302 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.256105 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khg2q_d18f574b-ac33-4b60-bcbc-856b463b231a/registry-server/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.429848 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.430669 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.529615 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.818359 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-content/0.log" Jan 10 17:35:34 crc kubenswrapper[5036]: I0110 17:35:34.908293 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/extract-utilities/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.042041 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-28lcz_dd0fc5aa-292a-4009-8cdf-0534293491f3/registry-server/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.083168 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.306837 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.325781 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.365262 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.499134 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-utilities/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.528409 5036 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/extract-content/0.log" Jan 10 17:35:35 crc kubenswrapper[5036]: I0110 17:35:35.991507 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vdpgl_91cf1499-408e-4bc3-b3ae-8f435079b904/registry-server/0.log" Jan 10 17:35:40 crc kubenswrapper[5036]: I0110 17:35:40.508808 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:35:40 crc kubenswrapper[5036]: E0110 17:35:40.509417 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:52 crc kubenswrapper[5036]: I0110 17:35:52.508162 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:35:52 crc kubenswrapper[5036]: E0110 17:35:52.509013 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.702087 5036 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:35:59 crc kubenswrapper[5036]: E0110 17:35:59.705842 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="extract-content" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.705952 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="extract-content" Jan 10 17:35:59 crc kubenswrapper[5036]: E0110 17:35:59.706065 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="registry-server" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.706136 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="registry-server" Jan 10 17:35:59 crc kubenswrapper[5036]: E0110 17:35:59.706193 5036 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="extract-utilities" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.706261 5036 state_mem.go:107] "Deleted CPUSet assignment" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="extract-utilities" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.706524 5036 memory_manager.go:354] "RemoveStaleState removing state" podUID="41498079-4e4d-412f-b877-624e5473b06d" containerName="registry-server" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.708744 5036 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.713076 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.893606 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.894021 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.894189 5036 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf69r\" (UniqueName: \"kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.995913 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf69r\" (UniqueName: \"kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.996080 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.996153 5036 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.996768 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:35:59 crc kubenswrapper[5036]: I0110 17:35:59.996812 5036 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:00 crc kubenswrapper[5036]: I0110 17:36:00.022846 5036 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lf69r\" (UniqueName: \"kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r\") pod \"community-operators-4f2h8\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:00 crc kubenswrapper[5036]: I0110 17:36:00.024810 5036 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:00 crc kubenswrapper[5036]: I0110 17:36:00.669853 5036 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:36:00 crc kubenswrapper[5036]: W0110 17:36:00.698316 5036 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81227eaa_bf9d_452f_83de_6965bb362f55.slice/crio-cd476f705a6f73bd44b95b3f261680f6db0cb69afafb1cbd0b1713afa393cb5c WatchSource:0}: Error finding container cd476f705a6f73bd44b95b3f261680f6db0cb69afafb1cbd0b1713afa393cb5c: Status 404 returned error can't find the container with id cd476f705a6f73bd44b95b3f261680f6db0cb69afafb1cbd0b1713afa393cb5c Jan 10 17:36:00 crc kubenswrapper[5036]: I0110 17:36:00.843072 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerStarted","Data":"cd476f705a6f73bd44b95b3f261680f6db0cb69afafb1cbd0b1713afa393cb5c"} Jan 10 17:36:01 crc kubenswrapper[5036]: I0110 17:36:01.858526 5036 generic.go:334] "Generic (PLEG): container finished" podID="81227eaa-bf9d-452f-83de-6965bb362f55" containerID="a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f" exitCode=0 Jan 10 17:36:01 crc kubenswrapper[5036]: I0110 17:36:01.858587 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerDied","Data":"a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f"} Jan 10 17:36:03 crc kubenswrapper[5036]: I0110 17:36:03.882102 5036 generic.go:334] "Generic (PLEG): container finished" podID="81227eaa-bf9d-452f-83de-6965bb362f55" containerID="c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12" exitCode=0 Jan 10 17:36:03 crc kubenswrapper[5036]: I0110 17:36:03.882520 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerDied","Data":"c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12"} Jan 10 17:36:04 crc kubenswrapper[5036]: I0110 17:36:04.891461 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerStarted","Data":"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787"} Jan 10 17:36:04 crc kubenswrapper[5036]: I0110 17:36:04.907207 5036 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4f2h8" podStartSLOduration=3.4515989080000002 podStartE2EDuration="5.90719424s" podCreationTimestamp="2026-01-10 17:35:59 +0000 UTC" firstStartedPulling="2026-01-10 17:36:01.862178989 +0000 UTC m=+4083.732414483" lastFinishedPulling="2026-01-10 17:36:04.317774321 +0000 UTC m=+4086.188009815" observedRunningTime="2026-01-10 17:36:04.904952827 +0000 UTC 
m=+4086.775188321" watchObservedRunningTime="2026-01-10 17:36:04.90719424 +0000 UTC m=+4086.777429734" Jan 10 17:36:05 crc kubenswrapper[5036]: I0110 17:36:05.508610 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:36:05 crc kubenswrapper[5036]: E0110 17:36:05.509275 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:36:10 crc kubenswrapper[5036]: I0110 17:36:10.025023 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:10 crc kubenswrapper[5036]: I0110 17:36:10.025514 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:10 crc kubenswrapper[5036]: I0110 17:36:10.076758 5036 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:11 crc kubenswrapper[5036]: I0110 17:36:11.002275 5036 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:11 crc kubenswrapper[5036]: I0110 17:36:11.053724 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:36:12 crc kubenswrapper[5036]: I0110 17:36:12.961760 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4f2h8" podUID="81227eaa-bf9d-452f-83de-6965bb362f55" containerName="registry-server" containerID="cri-o://d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787" gracePeriod=2 Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.516778 5036 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.626321 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content\") pod \"81227eaa-bf9d-452f-83de-6965bb362f55\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.626526 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities\") pod \"81227eaa-bf9d-452f-83de-6965bb362f55\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.626578 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf69r\" (UniqueName: \"kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r\") pod \"81227eaa-bf9d-452f-83de-6965bb362f55\" (UID: \"81227eaa-bf9d-452f-83de-6965bb362f55\") " Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.628865 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities" (OuterVolumeSpecName: "utilities") pod "81227eaa-bf9d-452f-83de-6965bb362f55" (UID: "81227eaa-bf9d-452f-83de-6965bb362f55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.648821 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r" (OuterVolumeSpecName: "kube-api-access-lf69r") pod "81227eaa-bf9d-452f-83de-6965bb362f55" (UID: "81227eaa-bf9d-452f-83de-6965bb362f55"). InnerVolumeSpecName "kube-api-access-lf69r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.674918 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81227eaa-bf9d-452f-83de-6965bb362f55" (UID: "81227eaa-bf9d-452f-83de-6965bb362f55"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.728798 5036 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-utilities\") on node \"crc\" DevicePath \"\"" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.728975 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf69r\" (UniqueName: \"kubernetes.io/projected/81227eaa-bf9d-452f-83de-6965bb362f55-kube-api-access-lf69r\") on node \"crc\" DevicePath \"\"" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.729041 5036 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81227eaa-bf9d-452f-83de-6965bb362f55-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 10 17:36:13 crc kubenswrapper[5036]: E0110 17:36:13.918020 5036 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.83:49256->38.102.83.83:37657: write tcp 38.102.83.83:49256->38.102.83.83:37657: write: broken pipe Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.972360 5036 generic.go:334] "Generic (PLEG): container finished" podID="81227eaa-bf9d-452f-83de-6965bb362f55" containerID="d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787" exitCode=0 Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.972409 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4f2h8" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.972427 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerDied","Data":"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787"} Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.974024 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4f2h8" event={"ID":"81227eaa-bf9d-452f-83de-6965bb362f55","Type":"ContainerDied","Data":"cd476f705a6f73bd44b95b3f261680f6db0cb69afafb1cbd0b1713afa393cb5c"} Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.974071 5036 scope.go:117] "RemoveContainer" containerID="d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787" Jan 10 17:36:13 crc kubenswrapper[5036]: I0110 17:36:13.996060 5036 scope.go:117] "RemoveContainer" containerID="c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.020596 5036 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.034893 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4f2h8"] Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.035358 5036 scope.go:117] "RemoveContainer" containerID="a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.067919 5036 scope.go:117] "RemoveContainer" containerID="d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787" Jan 10 17:36:14 crc kubenswrapper[5036]: E0110 17:36:14.068354 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787\": container with 
ID starting with d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787 not found: ID does not exist" containerID="d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.068381 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787"} err="failed to get container status \"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787\": rpc error: code = NotFound desc = could not find container \"d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787\": container with ID starting with d2db743183a86255608e02ac077ea5c92829fbf2cb9904b29b16e7ad14e18787 not found: ID does not exist" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.068401 5036 scope.go:117] "RemoveContainer" containerID="c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12" Jan 10 17:36:14 crc kubenswrapper[5036]: E0110 17:36:14.068724 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12\": container with ID starting with c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12 not found: ID does not exist" containerID="c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.068751 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12"} err="failed to get container status \"c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12\": rpc error: code = NotFound desc = could not find container \"c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12\": container with ID starting with c1feb4ac336551e012730c7fb672cfd8c768430453bcad33a042356cd17d0b12 not found: ID does not exist" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.068764 5036 scope.go:117] "RemoveContainer" containerID="a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f" Jan 10 17:36:14 crc kubenswrapper[5036]: E0110 17:36:14.070070 5036 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f\": container with ID starting with a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f not found: ID does not exist" containerID="a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.070120 5036 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f"} err="failed to get container status \"a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f\": rpc error: code = NotFound desc = could not find container \"a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f\": container with ID starting with a5f70d89dcd0fd256ee31f49e1da352744c76b4bf9f7da0e77b10226bed6e38f not found: ID does not exist" Jan 10 17:36:14 crc kubenswrapper[5036]: I0110 17:36:14.520424 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81227eaa-bf9d-452f-83de-6965bb362f55" path="/var/lib/kubelet/pods/81227eaa-bf9d-452f-83de-6965bb362f55/volumes" Jan 10 17:36:17 crc kubenswrapper[5036]: I0110 
17:36:17.510286 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:36:17 crc kubenswrapper[5036]: E0110 17:36:17.511420 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:36:31 crc kubenswrapper[5036]: I0110 17:36:31.508442 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:36:31 crc kubenswrapper[5036]: E0110 17:36:31.509557 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:36:46 crc kubenswrapper[5036]: I0110 17:36:46.508777 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:36:46 crc kubenswrapper[5036]: E0110 17:36:46.509892 5036 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kqphb_openshift-machine-config-operator(79756361-741e-4470-831b-6ee092bc6277)\"" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" Jan 10 17:37:01 crc kubenswrapper[5036]: I0110 17:37:01.508032 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" Jan 10 17:37:02 crc kubenswrapper[5036]: I0110 17:37:02.546354 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"bd5be67494c19f7db45abe4d993ffa6c70f6eb5a2fe59a988ba20bdfb387a56a"} Jan 10 17:37:22 crc kubenswrapper[5036]: I0110 17:37:22.800907 5036 generic.go:334] "Generic (PLEG): container finished" podID="689c3546-b16e-4265-9e6e-57ce3915c006" containerID="a8d3bd8cc11e3892c16acaa91326238c71e75b5a9024f216405f776357c99034" exitCode=0 Jan 10 17:37:22 crc kubenswrapper[5036]: I0110 17:37:22.801002 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" event={"ID":"689c3546-b16e-4265-9e6e-57ce3915c006","Type":"ContainerDied","Data":"a8d3bd8cc11e3892c16acaa91326238c71e75b5a9024f216405f776357c99034"} Jan 10 17:37:22 crc kubenswrapper[5036]: I0110 17:37:22.802195 5036 scope.go:117] "RemoveContainer" containerID="a8d3bd8cc11e3892c16acaa91326238c71e75b5a9024f216405f776357c99034" Jan 10 17:37:23 crc kubenswrapper[5036]: I0110 17:37:23.853472 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xk8v_must-gather-x4bc8_689c3546-b16e-4265-9e6e-57ce3915c006/gather/0.log" Jan 10 17:37:35 crc kubenswrapper[5036]: I0110 17:37:35.670455 5036 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-9xk8v/must-gather-x4bc8"] Jan 10 17:37:35 crc kubenswrapper[5036]: I0110 17:37:35.671169 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" podUID="689c3546-b16e-4265-9e6e-57ce3915c006" containerName="copy" containerID="cri-o://ed6c84dd9322b966a9bcbeac915c61b4480c44e10f9a68a1c9abda1a5ee0740f" gracePeriod=2 Jan 10 17:37:35 crc kubenswrapper[5036]: I0110 17:37:35.685246 5036 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9xk8v/must-gather-x4bc8"] Jan 10 17:37:35 crc kubenswrapper[5036]: I0110 17:37:35.954019 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xk8v_must-gather-x4bc8_689c3546-b16e-4265-9e6e-57ce3915c006/copy/0.log" Jan 10 17:37:35 crc kubenswrapper[5036]: I0110 17:37:35.954922 5036 generic.go:334] "Generic (PLEG): container finished" podID="689c3546-b16e-4265-9e6e-57ce3915c006" containerID="ed6c84dd9322b966a9bcbeac915c61b4480c44e10f9a68a1c9abda1a5ee0740f" exitCode=143 Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.120463 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xk8v_must-gather-x4bc8_689c3546-b16e-4265-9e6e-57ce3915c006/copy/0.log" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.120996 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.157358 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output\") pod \"689c3546-b16e-4265-9e6e-57ce3915c006\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.157478 5036 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9qtd\" (UniqueName: \"kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd\") pod \"689c3546-b16e-4265-9e6e-57ce3915c006\" (UID: \"689c3546-b16e-4265-9e6e-57ce3915c006\") " Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.258293 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd" (OuterVolumeSpecName: "kube-api-access-x9qtd") pod "689c3546-b16e-4265-9e6e-57ce3915c006" (UID: "689c3546-b16e-4265-9e6e-57ce3915c006"). InnerVolumeSpecName "kube-api-access-x9qtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.260370 5036 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9qtd\" (UniqueName: \"kubernetes.io/projected/689c3546-b16e-4265-9e6e-57ce3915c006-kube-api-access-x9qtd\") on node \"crc\" DevicePath \"\"" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.337887 5036 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "689c3546-b16e-4265-9e6e-57ce3915c006" (UID: "689c3546-b16e-4265-9e6e-57ce3915c006"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.373151 5036 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689c3546-b16e-4265-9e6e-57ce3915c006-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.520219 5036 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689c3546-b16e-4265-9e6e-57ce3915c006" path="/var/lib/kubelet/pods/689c3546-b16e-4265-9e6e-57ce3915c006/volumes" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.964228 5036 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9xk8v_must-gather-x4bc8_689c3546-b16e-4265-9e6e-57ce3915c006/copy/0.log" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.964579 5036 scope.go:117] "RemoveContainer" containerID="ed6c84dd9322b966a9bcbeac915c61b4480c44e10f9a68a1c9abda1a5ee0740f" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.964729 5036 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9xk8v/must-gather-x4bc8" Jan 10 17:37:36 crc kubenswrapper[5036]: I0110 17:37:36.982893 5036 scope.go:117] "RemoveContainer" containerID="a8d3bd8cc11e3892c16acaa91326238c71e75b5a9024f216405f776357c99034" Jan 10 17:38:25 crc kubenswrapper[5036]: I0110 17:38:25.552094 5036 scope.go:117] "RemoveContainer" containerID="f8df9c4bf82431c198010ff33aa75a9615fdae113af9b8e769410e44edc416e6" Jan 10 17:38:25 crc kubenswrapper[5036]: I0110 17:38:25.584125 5036 scope.go:117] "RemoveContainer" containerID="f1f3b021646b558abb05629a929e5420aa688a0eba9ff529894b440c9532b1d4" Jan 10 17:38:25 crc kubenswrapper[5036]: I0110 17:38:25.648969 5036 scope.go:117] "RemoveContainer" containerID="4e3ef8b194ac151f09190dab31c348f9fd1f383e3d7e4bba4a38812fe0f027f3" Jan 10 17:38:25 crc kubenswrapper[5036]: I0110 17:38:25.694962 5036 scope.go:117] "RemoveContainer" containerID="d09bdb3e2f4de3cf032352db23540827ee9fb2540bff92579be768c1a4b7b230" Jan 10 17:38:25 crc kubenswrapper[5036]: I0110 17:38:25.721068 5036 scope.go:117] "RemoveContainer" containerID="8f17873de3a838cec6d7ccc126b02773e93b8836f29e2234204b0172ad4e9c6c" Jan 10 17:39:25 crc kubenswrapper[5036]: I0110 17:39:25.904303 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:39:25 crc kubenswrapper[5036]: I0110 17:39:25.904971 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:39:55 crc kubenswrapper[5036]: I0110 17:39:55.905234 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:39:55 crc kubenswrapper[5036]: I0110 17:39:55.906294 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" 
podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:40:25 crc kubenswrapper[5036]: I0110 17:40:25.904664 5036 patch_prober.go:28] interesting pod/machine-config-daemon-kqphb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 10 17:40:25 crc kubenswrapper[5036]: I0110 17:40:25.905599 5036 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 10 17:40:25 crc kubenswrapper[5036]: I0110 17:40:25.905719 5036 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" Jan 10 17:40:25 crc kubenswrapper[5036]: I0110 17:40:25.906906 5036 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd5be67494c19f7db45abe4d993ffa6c70f6eb5a2fe59a988ba20bdfb387a56a"} pod="openshift-machine-config-operator/machine-config-daemon-kqphb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 10 17:40:25 crc kubenswrapper[5036]: I0110 17:40:25.907005 5036 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" podUID="79756361-741e-4470-831b-6ee092bc6277" containerName="machine-config-daemon" containerID="cri-o://bd5be67494c19f7db45abe4d993ffa6c70f6eb5a2fe59a988ba20bdfb387a56a" gracePeriod=600 Jan 10 17:40:26 crc kubenswrapper[5036]: I0110 17:40:26.862357 5036 generic.go:334] "Generic (PLEG): container finished" podID="79756361-741e-4470-831b-6ee092bc6277" containerID="bd5be67494c19f7db45abe4d993ffa6c70f6eb5a2fe59a988ba20bdfb387a56a" exitCode=0 Jan 10 17:40:26 crc kubenswrapper[5036]: I0110 17:40:26.862451 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerDied","Data":"bd5be67494c19f7db45abe4d993ffa6c70f6eb5a2fe59a988ba20bdfb387a56a"} Jan 10 17:40:26 crc kubenswrapper[5036]: I0110 17:40:26.863560 5036 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kqphb" event={"ID":"79756361-741e-4470-831b-6ee092bc6277","Type":"ContainerStarted","Data":"69cb35f2346736a8fcbd2d2f289de9834af230405fc75358babab70687503929"} Jan 10 17:40:26 crc kubenswrapper[5036]: I0110 17:40:26.863619 5036 scope.go:117] "RemoveContainer" containerID="2a381b45caba58ec8e26418299e509a6248a1d821c6d11596cb561969df97b6e" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515130507447024453 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015130507450017362 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015130476505016513 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015130476506015464 5ustar corecore